E-R Diagram Cardinality and Participation

Cardinality and participation explained for E-R diagrams.

  • How to show cardinality/participation for several methodologies.
  • Examples of notation and diagrams.

In my previous posts, I discussed some E-R diagram basics and some common mistakes to avoid. In this post, I’ll cover cardinality and participation. Cardinality is a count of the number of times one entity can (or must) be associated with each occurrence of another entity. Participation refers to whether an entity must participate in a relationship with another entity to exist.

Cardinality and Participation Constraints in E-R Diagrams

In general, cardinality tells you “How many”. Cardinality can be:

  • One to one (1:1): every time one entity occurs, there is exactly one occurrence of another entity.
  • One to many (1:m):  every time one entity occurs, there are multiple occurrences of another entity.
  • Many to many (m:m): for each occurrence of an entity, there can be one or many occurrences of another, and vice versa.

In some notations, a cardinality constraint corresponds to maximum cardinality. In other notations, cardinality may be combined with participation (a “minimum”).

Participation can be total or partial (optional):

  • Total participation is where an entity must participate in a relationship to exist. For example, an employee must work for at least one department to exist as an “employee”.
  • Partial (optional) participation is where the entity can exist without participating in a relationship with another entity [1]. For example, the entity “course” may exist within an organization, even though it has no current students.

How to Show Cardinality and Participation on an E-R Diagram

While there are many ways to create E-R diagrams, a straightforward approach is to create a rough draft of your ERD first, then tackle cardinality. While most methodologies (e.g., Chen, IE, min/max) use the same shapes for entities (rectangles) and relationships (diamonds), each has its own specific notation for cardinality. This is where you must choose a methodology, as you must stay consistent when building cardinality constraints.

One of the most important choices is between Look-Here and Look-Across methods. "Look-Here" and "Look-Across" refer to where the cardinality and participation constraints are specified in E-R diagrams.

With the “Look-Here” cardinality constraint, you're literally looking “here” (i.e., next to the entity) to determine cardinality.

In the above example using the min/max method, the upper-bound cardinality constraint 1 states that each employee may appear in the “works for” relationship at most once, meaning they can work for only one department. The lower-bound constraint 1 states that each employee must appear in the “works for” relationship at least once. On the other hand, there are no restrictions on how many employees a department may have; the upper bound N indicates no limit [2].

With the “Look-Across” constraint, you must look to the other side of the relationship to garner meaning:

 

Different Methodologies and Ways to Show Cardinality

The following examples illustrate how different methodologies depict cardinality and participation:

 

Crow's foot is one of the most popular notations for creating E-R diagrams. With crow's foot notation, cardinality is represented by decorations on the ends of lines.

A cardinality of one is represented by a straight line perpendicular to the relationship line.

────┼   

A cardinality of many is usually represented by the three-pronged ‘crow-foot’ symbol, but may also be represented by a two-pronged symbol [2]:

────<    many

Other basic symbols:

───┼<    one or many

──o─<    zero or many

───┼┼    exactly one.

The IE method is very similar to crowsfeet but does not show attributes related to a relationship: the relationship is depicted as a named line between two entities. Cardinality and participation constraints are combined into min/max (bar and crowfoot) notation. Cardinality is shown as follows [1]:

  • One and only one: Two bars at end of line or single bar
  • Zero or one: Hollow dot and one bar
  • One or more: One bar and crowfoot
  • Zero, one or more: Hollow dot and crowfoot
  • More than one: Crowfoot.

Your choice of which methodology to use is probably going to be determined by your company’s preference (or perhaps your instructor’s). Unfortunately, differences between notations make the diagrams challenging to transfer from one format to another [1].  

References

Images: By Author

[1] A Comparative Analysis of Entity-Relationship Diagrams

[2] Entity Relationship Modelling

Source Prolead brokers usa

Doing Python Math Operations With Numpy (Part III)

In my previous post, I explored how to use Pandas to work with data frames and similar structures. In this post, I want to go to the next level and discuss the magical operations available with the NumPy (Numerical Python) library, including fast array manipulation.

Numerical Python = NumPy

Why should we go with NumPy?

  • Provides data structures and algorithms for scientific applications that require numerical data.
  • Supports multi-dimensional array manipulation; NumPy's array object is called ndarray.
  • Makes it easy to reshape, slice, and dice arrays, with fast array-processing capability.
  • Makes complex mathematical implementations very simple.
  • Performs numerical and trigonometric functions (e.g., sin, cos, tan, mean, median).
  • Offers excellent support for linear algebra, Fourier transforms, etc.
  • NumPy arrays are far more efficient than Python lists; the way NumPy processes and manipulates them is fast.
  • Often used alongside other Python packages such as SciPy and Matplotlib.

How should we import NumPy?

import numpy as np

What Can NumPy Do?

The sections below cover the capabilities of NumPy. Let's discuss them one by one.

I. Exploring the dimensions in the array

a. One-Dimensional and Multi-Dimensional Arrays

(a.1) 1-D

import numpy as np
a = np.array([100,200,300,400,500])
print (a)
OUTPUT
[100 200 300 400 500]

(a.2) n-D
a = np.array([[100, 200], [300, 400]])
print (a)
OUTPUT
[[100 200]
[300 400]]

b. Number of Dimensions
x = np.array(1)
y = np.array([1, 2, 3, 4, 5])
z = np.array([[1, 2, 3], [4, 5, 6]])
print(x.ndim)
print(y.ndim)
print(z.ndim)
OUTPUT
0
1
2

c. Finding Type of the array
arr = np.array([1, 2, 3, 4, 5])
print(arr)
print(type(arr))
OUTPUT
[1 2 3 4 5]
<class 'numpy.ndarray'>

d. Accessing array elements
arr = np.array([1, 2, 3, 4])
print(arr[0])
OUTPUT
1

e. Slicing array elements
import numpy as np
arr = np.array([1, 2, 3, 4, 5, 6, 7])
print(arr[1:5])
OUTPUT
[2 3 4 5]
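
Slicing also supports step values and negative indices, which the example above doesn't show. A short sketch, using the same array:

```python
import numpy as np

arr = np.array([1, 2, 3, 4, 5, 6, 7])
print(arr[::2])     # every second element -> [1 3 5 7]
print(arr[-3:])     # last three elements  -> [5 6 7]
print(arr[5:1:-1])  # a reversed slice     -> [6 5 4 3]
```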

II. Data Type Objects Specific to NumPy

NumPy has additional data types beyond Python's built-ins. Let's see some of them, with simple code to find the type of a variable.
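
As a small sketch of how NumPy infers or accepts a dtype (note that the default integer type is platform dependent, so only the float and explicit cases are guaranteed):

```python
import numpy as np

a = np.array([1, 2, 3])
b = np.array([1.5, 2.5])
c = np.array([1, 2], dtype=np.int8)  # explicitly request a smaller integer type

print(a.dtype)  # platform-dependent default integer, e.g. int64
print(b.dtype)  # float64
print(c.dtype)  # int8
```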

III. Finding the Shape and Reshaping the ndarray

a. Shape of the array

(a.1) The array has a shape attribute that returns the array's dimensions.

import numpy as np
a = np.array([[1,2,3],[4,5,6],[4,5,6]])
print("Shape of the array:",a.shape)
a = np.array([[1,2,3,4],[3,4,5,6]])
print("Shape of the array:",a.shape)
OUTPUT
Shape of the array: (3, 3)
Shape of the array: (2, 4)

b. Re-shape of the array

(b.1) You can reshape the array in a few simple steps:

import numpy as np
a = np.array([[1,2,3],[4,5,6]])
b = a.reshape(3,2)
print("Array a(Actual shape):\n",a)
print("Shape of the array:\n",a.shape)
print("Array b(After reshape):\n",b)
print("After Re-shape of the array:\n",b.shape)

OUTPUT
Array a(Actual shape):
[[1 2 3]
[4 5 6]]
Shape of the array:
(2, 3)
Array b(After reshape):
[[1 2]
[3 4]
[5 6]]
After Re-shape of the array:
(3, 2)
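
reshape() can also infer one dimension for you when you pass -1, a handy shortcut alongside the explicit form above:

```python
import numpy as np

a = np.arange(12)      # [0 1 ... 11]
b = a.reshape(3, -1)   # -1 tells NumPy to infer the second dimension
print(b.shape)         # (3, 4)

flat = b.reshape(-1)   # flatten back to 1-D
print(flat.shape)      # (12,)
```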

IV. Converting List and Tuple into ndarray

(a) List into Array

import numpy as np
x = [1,2,3]
print("List:",x)
a = np.asarray(x)
print("As Array:",a)

OUTPUT

List: [1, 2, 3]
As Array: [1 2 3]

(b) Tuple into Array

import numpy as np
x = (1,2,3)
print("Tuple:",x)
a = np.asarray(x)
print("As Array:",a)

OUTPUT

Tuple: (1, 2, 3)
As Array: [1 2 3]

V. Array Join and Split

(a) Join Array

Joining merges two or more arrays into a single array, using the concatenate() function.

import numpy as np
arr1 = np.array([1, 2, 3])
arr2 = np.array([4, 5, 6])
print(arr1)
print(arr2)
arr = np.concatenate((arr1, arr2))
print("After concatenate :",arr)

OUTPUT

[1 2 3]
[4 5 6]
After concatenate : [1 2 3 4 5 6]

(b) Splitting Array

Splitting is the opposite of joining: it breaks one array into multiple arrays, using array_split().

import numpy as np
arr = np.array([1, 2, 3, 4, 5, 6, 7, 8])
print("Actual Array:",arr)
newarr = np.array_split(arr, 4)
print("Split:",newarr)

OUTPUT 

Actual Array: [1 2 3 4 5 6 7 8]
Split: [array([1, 2]), array([3, 4]), array([5, 6]), array([7, 8])]
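
One detail worth noting: array_split() tolerates sizes that do not divide the array evenly, whereas np.split() would raise an error in that case:

```python
import numpy as np

arr = np.array([1, 2, 3, 4, 5, 6, 7, 8])
parts = np.array_split(arr, 3)  # 8 elements into 3 parts
print([len(p) for p in parts])  # [3, 3, 2] -- uneven sizes are allowed
# np.split(arr, 3) would raise ValueError, since 8 is not divisible by 3
```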

VI. Create an Array with Ranges

arange() takes start, stop, and step values:

import numpy as np 
x = np.arange(10,50,5)
print (x)

OUTPUT

[10 15 20 25 30 35 40 45]
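
A related helper is linspace(), which fixes the number of points instead of the step size; this sketch produces the same values as the arange() call above, plus the endpoint:

```python
import numpy as np

# linspace(start, stop, num) includes the stop value by default
x = np.linspace(10, 45, 8)
print(x)  # [10. 15. 20. 25. 30. 35. 40. 45.]
```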

VII. Array Multiplication

This simple multiplication works element-wise, provided the arrays are compatible. Same-shape multiplication is the simplest case; arrays of different shapes can also be multiplied under NumPy's broadcasting rules, which have limitations:

  • Arrays should have exactly the same shape, or
  • their dimensions must be compatible (equal, or one of them 1), compared from the trailing dimension.

a = np.array([1,2,3,4])
b = np.array([5,5,5,5])
print (a)
print (b)
c = a * b
print (c)

OUTPUT

[1 2 3 4]
[5 5 5 5]
[ 5 10 15 20]
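
When the shapes are not identical, broadcasting can still make the multiplication work, provided each trailing dimension matches or is 1. A minimal sketch:

```python
import numpy as np

a = np.array([[1, 2, 3],
              [4, 5, 6]])    # shape (2, 3)
b = np.array([10, 20, 30])   # shape (3,), broadcast across both rows
print(a * b)
# [[ 10  40  90]
#  [ 40 100 180]]

print(a * 2)  # a scalar broadcasts to every element
```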

VIII. Mathematical Operations

As mentioned earlier, NumPy supports a number of mathematical operations for handling numbers.

  • Arithmetic Operations
  • Statistical Functions
  • Trigonometric Functions
  • Linear Algebra

(a) Arithmetic Operations

These are very straightforward operations; we are all familiar with add(), subtract(), multiply(), and divide(). The arrays should have the same shape or conform to the array broadcasting rules.

a = np.array([2,2,2])
b = np.array([10,10,10])
print('First array:',a)
print('Second array:',b)
print('Add the two arrays:',np.add(a,b))
print('Subtract the two arrays:',np.subtract(a,b))
print('Multiply the two arrays:',np.multiply(a,b))
print('Divide the two arrays:',np.divide(a,b))

OUTPUT

First array: [2 2 2]
Second array: [10 10 10]
Add the two arrays: [12 12 12]
Subtract the two arrays: [-8 -8 -8]
Multiply the two arrays: [20 20 20]
Divide the two arrays: [0.2 0.2 0.2]

(b) Statistical Functions

NumPy has very useful statistical functions.

  • Minimum
  • Maximum
  • Percentile
  • Standard Deviation
  • Variance

import numpy as np
a = np.array([[3,7,5],[8,4,3],[2,4,9]])
print('Given array is:')
print(a)
print("Minimum :",np.amin(a))
print("Maximum :",np.amax(a))
print("Standard Deviation :",np.std(a))
print("Variance :",np.var(a))
print("Percentile :",np.percentile(a, 4))

OUTPUT

Given array is:
[[3 7 5]
[8 4 3]
[2 4 9]]
Minimum : 2
Maximum : 9
Standard Deviation : 2.309401076758503
Variance : 5.333333333333333
Percentile : 2.32
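
These statistical functions also accept an axis argument to work per row or per column; mean() and median(), not shown in the list above, follow the same pattern:

```python
import numpy as np

a = np.array([[3, 7, 5],
              [8, 4, 3],
              [2, 4, 9]])
print(np.amin(a, axis=0))  # column minimums -> [2 4 3]
print(np.amax(a, axis=1))  # row maximums    -> [7 8 9]
print(np.mean(a))          # overall mean    -> 5.0
print(np.median(a))        # overall median  -> 4.0
```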

(c) Linear Algebra

The NumPy package contains the numpy.linalg module, which provides linear algebra functions.

import numpy as np

a = np.array([[1,2],[3,4]])
b = np.array([[11,12],[13,14]])
print("Dot Operation:", np.dot(a,b))
print("vdot :",np.vdot(a,b))

OUTPUT

Dot Operation: [[37 40]
[85 92]]
vdot : 130
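
numpy.linalg can also solve linear systems directly. A small sketch solving x + 2y = 5 and 3x + 4y = 11:

```python
import numpy as np

A = np.array([[1, 2], [3, 4]])
b = np.array([5, 11])

x = np.linalg.solve(A, b)
print(x)                     # [1. 2.]
print(np.linalg.det(A))      # determinant, approximately -2.0
print(np.linalg.inv(A) @ b)  # same solution via the inverse (less stable numerically)
```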

IX. NumPy with Matplotlib

As mentioned earlier, NumPy works along with the Matplotlib library, helping us create various charts. Let's quickly look at a few; it's a really interesting piece of work.

import numpy as np
from matplotlib import pyplot as plt
x = np.arange(1,11)
#simple linear equation
y = 2 * x + 5
plt.title("Matplotlib demo")
plt.xlabel("height")
plt.ylabel("weight")
plt.plot(x,y)
plt.show()

OUTPUT: a straight-line plot of y = 2x + 5

import numpy as np
import matplotlib.pyplot as plt
x = np.arange(0, 4 * np.pi, 0.2)
y = np.sin(x)
plt.title("Sine Wave Form")
plt.plot(x, y)
plt.show()

OUTPUT: a sine wave plot

from matplotlib import pyplot as plt
x = [5,15,10]
y = [12,16,18]
x2 = [6,9,14]
y2 = [6,15,7]
plt.bar(x, y, align = 'center')
plt.bar(x2, y2, align = 'center')
plt.ylabel('Performance')
plt.xlabel('Slots')
plt.show()

X. Working with String Arrays

NumPy provides various options to play around with strings.

import numpy as np
print("Capitalize (hello world):",np.char.capitalize("hello world"))
print("Title (hello how are you?):",np.char.title('hello how are you?'))
print("Lower (HELLO WORLD):",np.char.lower(['HELLO WORLD']))
print("Upper (hellow):",np.char.upper('hellow'))
print("Split (hello how are you?):",np.char.split('hello how are you?'))
print("Split (Python,Pandas,NumPy):",np.char.split('Python,Pandas,NumPy', sep = ','))
print("Strip (welcome watts):",np.char.strip('welcome watts','w'))
print("Join (dmy):",np.char.join(':','dmy'))
print("Join (dmy):",np.char.join([':','-'],['dmy','ymd']))
print("Replace (Python is a programming language):",np.char.replace('Python is a programming language', 'programming', 'powerful programming'))
print('Concatenate two strings Hello,Mr.Babu:',np.char.add(['Hello'],['Mr.Babu']))
print('Concatenation example [Hello, Hi],[ Shantha , Babu]:',np.char.add(['Hello', 'Hi'],[' Shantha ', ' Babu']))

OUTPUT

Capitalize (hello world): Hello world
Title (hello how are you?): Hello How Are You?
Lower (HELLO WORLD): ['hello world']
Upper (hellow): HELLOW
Split (hello how are you?): ['hello', 'how', 'are', 'you?']
Split (Python,Pandas,NumPy): ['Python', 'Pandas', 'NumPy']
Strip (welcome watts): elcome watts
Join (dmy): d:m:y
Join (dmy): ['d:m:y' 'y-m-d']
Replace (Python is a programming language): Python is a powerful programming language
Concatenate two strings Hello,Mr.Babu: ['HelloMr.Babu']
Concatenation example [Hello, Hi],[ Shantha , Babu]: ['Hello Shantha ' 'Hi Babu']

Hope you all enjoyed NumPy and its capabilities. There is still much more, but I have covered the most important features, the ones mainly used during Data Science and Machine Learning implementations when dealing with datasets.

Thanks for your time in reading this article. Leave your comments, and I will be back shortly with more interesting topics.


How Artificial Intelligence is Revolutionizing Mental Healthcare

Science fiction author William Gibson has been credited with saying,

“The future is already here; it's just not very evenly distributed.”

Revolutionary artificial intelligence algorithms are creeping into mental healthcare and reshaping its dimensions. You might already be chatting with an AI bot right now that asks, “How does it make you feel to hear that?” Your AI therapist might even be successful enough to ease your worry about what direction the future will take with the advent of artificial intelligence. Looking beyond the horrifying headlines of Skynet coming true, the progressive utilization of artificial intelligence in mental healthcare is absolutely splendid news for many of us.

Depression is a global problem with serious consequences. Almost 1 out of 5 Americans deal with mental health conditions at some point in their lives. Yet, in numerous cases, we rely on individuals to seek treatment, despite the continuing stigma against asking for help with mental health struggles. Almost 40% of the world's population lives in areas where there are not enough mental healthcare professionals to meet the community's requirements. Currently, we are facing a severe mental health crisis aggravated by the pandemic outbreak.

Thanks to rapid technological innovation and advancement, AI algorithms are capable of spearheading a positive transformation in a space where change has been a long journey. Here is a list of five optimistic ways artificial intelligence and its innovative algorithms are revolutionizing mental healthcare.

#1  Helping to Characterize Mental Health Issues 

Making mental health treatment and diagnosis less subjective and more quantifiable helps destigmatize such conditions and improve outcomes. There is no blood test for mental health conditions. Artificial intelligence and promising machine learning algorithms can serve as research-based objective tests, making the decision to seek treatment more about evidence-based data and best medical practices, and less about the patient's subjective experience of distress.

Democratizing access to mental health treatment will certainly mainstream this kind of healthcare. The more people who seek treatment for mental health issues or who know someone who is suffering from mental health struggles, the less mental health treatment will feel like a secret or a shame. 

#2   Making Support Available Anytime Anywhere

Mobile devices are in the hands of practically every individual. Applications and chatbots are accessible to everyone, no matter where they are; undoubtedly, that's the most affordable treatment option. They are always awake and on call. According to researchers and analysts, most people feel more comfortable sharing their feelings with an anonymous chatbot than with a human being.

Obviously, such tools are still new and experimental. That said, chatbots and other existing app-based tools for tracking and enhancing mood can be tremendously beneficial, mainly for patients who usually struggle to access care. A wide range of mobile tools already walk patients through exercises based on cognitive behavioral therapy and other research-based techniques. Chatbots can thus be lifelines for those suffering from depression and anxiety in the middle of the night.

#3   Flagging Early Warning Signs of Dangerous Troubles 

Just think what it would be like if your smartphone were smart enough to notify your doctor that you are at risk of depression based on how often you leave your house or how fast you type. In one study, language analysis algorithms were 100% accurate at identifying teens who were likely to develop psychosis. Such tools already exist, and without a doubt, they are very powerful.

Language analysis is utilized to monitor patients who are going through certain treatments and warn doctors when they take a turn for the worse. Most patients do not visit their doctor or therapist regularly, but answering a few questions online on a day-to-day basis helps the app to detect early signs of trouble. AI tools provide invaluable support to human providers and patients, establishing daily checkpoints that can catch a downturn before it turns into a serious spiral.

#4   Reducing Bias and Human Error 

Artificial intelligence algorithms have proven successful at detecting signs of conditions such as anxiety, depression, and post-traumatic stress disorder by analyzing facial expressions and voice patterns. Physical and mental health providers are using such AI-powered tools as a backup during meetings with patients, which are usually very brief.

Such tools are beneficial because healthcare providers, who are usually rushing from appointment to appointment, may not notice the signs of trouble a patient exhibits. Moreover, AI tools can remind a busy physician to push past that surface-level appearance and dive into problems that are not yet acute.

#5   Integrating Mental Health Care with Physical Healthcare

Can you think of a future where machine learning algorithms can warn surgeons and doctors that a patient is at risk based on their existing medical record? In one study, algorithms successfully predicted which patients brought into the hospital because of self-injury were likely to attempt suicide soon afterwards.

For instance, when it comes to the opioid crisis, data shows that 10% of patients who use opioids in the 90 days after their surgery will continue to depend on those medications. Such a patient could be referred to a therapist who specializes in medication tapering, so that they can widen the array of techniques they use to manage their pain and other symptoms.

Health is the Greatest Wealth 

Mental health plays a pivotal role in our lives. Anxiety, depression, and stress often arise when people prefer to please others instead of themselves. The biggest stigma with mental health is that most people still don't talk, or are afraid to talk, about it, even though health is the greatest wealth.

People fear that AI bots will soon replace human therapists, but that's not true. Instead, artificial intelligence will support human therapists. AI can serve as an early warning system providing support all day, every day. It can offer lifelines to those who live in rural areas where mental health support is difficult to find, and it can benefit those who cannot afford to visit therapists regularly. Without a doubt, mental healthcare is an enormous challenge in our society, and the COVID-19 outbreak has only compounded this challenge.

At the heart of it all, artificial intelligence has the potential to revolutionize mental healthcare, making care more affordable, responsive, and accessible. In the approaching years, AI algorithms will be our first line of defense against mental health struggles that can be debilitating for a tremendous number of people.


Information Communication Technology Ensures a Safe Social Media Ecosphere

Overview – Information Communication Technology (ICT)

In its most basic form, Information and Communication Technology (ICT) may be described as an electronic medium for generating, preserving, altering, obtaining, and transmitting information from one location to another. It facilitates the transmission of messages by making them more accessible, understandable, and interpretable. This is accomplished via technological devices such as mobile telephones, the Internet and wireless networks, computers, radios, televisions, spacecraft, and cell towers, among others. These elements are used in the creation, storage, communication, transmission, and management of data.

Technology is a critical motivator in empowering people, communities, and nations, facilitating growth and enhancing skills, all while maximizing the benefits of the democracy dividends for everyone.

Information technologies have made it feasible for companies to operate around the clock, from anywhere in the world. This implies that a company may be open at any time and from any location, making international transactions simpler and more convenient. You may also have your items delivered to your door without having to move a single muscle, which is a huge convenience.

In many economic areas, there are fresh opportunities for further study to enhance certification. A degree may be earned entirely online from the comfort of one's own home, making it feasible to hold down a job while pursuing a degree.

Importance of Information and Communication Technology (ICT)

The widespread popularity of technology has created uncertainty in the management of many companies as a result of its expanding use. The management and support of these multifaceted environments, consisting of a diverse range of PCs, desktops, laptops, mobile and wireless equipment, printers, connectivity, and applications, has proved difficult and costly for IT departments over time.

  1. It develops in students an analytical mindset that allows them to analyze and provide answers to issues arising in all connected areas that use it as a learning tool.
  2. Because it is a new academic area of study, it encourages students to be creative and to come up with new scientific approaches to problem resolution.
  3. It simplifies the procedure of storing and retrieving data.
  4. It contributes to the development of computer networking, now known as the internet and intranet.
  5. It has the potential to stimulate economic growth at the national level, since it is a reliable source of national revenue for countries that have properly recognized its value.
  6. It generates meaningful work, thus providing a sustainable source of income.
  7. It makes it easier to comprehend other topics. Almost all modern teaching aids, such as the use of a projector in the classroom, depend on information and communications technology.
  8. It serves as a platform for the exchange of ideas and innovations among information technology academics, both locally and globally.
  9. It serves as the foundation for e-learning and the creation of online libraries. As a result, disseminating information is now simpler than ever.
  10. It plays an important role in globalization in every respect and in the achievement of the Millennium Development Goals.
  11. It is utilized in a variety of offices to ensure that official actions and organizations are properly documented.

ICT in the Social and Commercial Spheres

There is no disputing that since the beginning of social media, social systems have become part of our everyday existence and everything has changed: the way we mingle, engage, arrange events, and go out. We will not debate the ethical implications of how social media affects everyone's lives here. Instead, this post concentrates on the many ways social media is altering how our systems operate, both inside and outside the classroom.

For-profit organizations have always used technological advancements to enhance their income. Government agencies and non-governmental organizations (NGOs), on the other hand, have failed to use them effectively for social benefit. The social enterprise, a new kind of organization that is developing, is working to close this gap.

The Internet of Things (IoT) is playing an increasingly important role in the creation and growth of social enterprises. It is impossible to overstate the importance of ICT infrastructure for social enterprises: it has made social impact cheaper and more scalable, and it has opened the door to new methods of connecting and engaging with local communities (a key characteristic of the social business). Social media websites and applications, which enable users to produce and share user-generated content on the internet, are becoming more popular.

Businesses may now react to consumer inquiries more quickly than ever before thanks to new channels such as social media and instant chat. Doctors and nurses may assist healthcare organizations through video-conferencing systems. The possibilities for connection are almost limitless.

Furthermore, the use of information and communications technology (ICT) in corporate communications is enhancing the way businesses cooperate. Everything from presentations and documents to videos may now be shared online. This guarantees that organizations can collaborate more efficiently, regardless of where they're located.

 

Final words

Embracing communication technology in businesses and other areas

Ultimately, ICT has an enormous influence on today's business landscape. It does not matter whether your business is small or large, or what industry it is in: there is no question that you're using some type of information and networking technology platform. A good company growth plan, like any other, is critical to attaining outstanding outcomes, and getting a proper approach and plan in place is essential.

Business leaders need to think carefully about the problem they wish to solve with their information communication technology solutions and work backward from there. Identifying the problems you wish to resolve can aid in selecting the most appropriate contractor to engage with. Once you've established your strategy, you may start developing a method for improved information sharing that is tailored to your organization's needs.


The Essential Role of Automation in End-User Computing

The world over, two pragmatic ideas are taking root. Businesses want employees to spend less time tinkering with their devices instead of being productive. They also want their IT teams to invest less time fixing PCs and more time providing value.

Kulvinder Dosanjh (Kully), Group Director IT Service Delivery and Business Information Security Officer at BMI Group, explains this from his experience in the new normal. His organization, a leader in the roofing and waterproofing segment, faces a routine Monday morning productivity loss familiar to many businesses: Service Desk tickets balloon, employees wait idle or are handicapped while their systems are fixed; the Service Desk spends time taking remedial action and hundreds of productive hours are lost across the organization.

The Work from Home trend gaining traction has compounded the problem. Employees are struggling with new processes, applications, and security protocols. They expect plenty of hand-holding from IT. Kully's organization is solving this by deploying automation and AI in end-user computing environments.

Kully sees the automated world delivering a different perspective to both business users and IT support teams. He strongly believes automation is and will continue to change the way IT support works and business users get support. With the intelligence built around end-user devices, huge amounts of system data get generated —which no human can practically go through— this data gets automatically captured, digested, and parsed. An AI layer is then used to create insights at a device level. These trigger self-heal processes, self-service recommendations, chatbot assistance, and a task list for the IT team.

“What this does is simple,” says Kully, “It proactively prevents incidents that affect productivity. It provides the IT team with tasks to follow up instead of having to take calls or emails from users.” Now, employees don’t wake up on Monday—or any other day—with the question, “What is happening and when can it be fixed?” 

While a customer-focused process is better than an SLA-focused process, security has become a major consideration in the new world of accessing data from the outside. The old model of connecting to data and systems from the inside out has shifted 180 degrees to accessing data from the outside in, especially after the pandemic has affected global business. And with employees doing their own thing, enterprise systems and networks are more vulnerable. However, security systems, processes, and protocols can't be made so complex that they inconvenience employees and hinder productivity.

Sujoy Chatterjee, Vice President, Infrastructure Services at ITC Infotech believes that a balance must be struck between the levels of security and the flexibility available to employees to use technology. To decide the right level of security, the business must be involved. At BMI, a central team uses a risk-based approach and ensures that decisions don’t impact business. As a matter of abundant caution, processes are in place to mitigate the impact of any breach in security.

“The goal,” says Kully, “Is to bring data back to business—without which, we may have no business!” The takeaway from the BMI experience can be boiled down to two essentials: One, leverage AI to make data more effective and improve productivity and efficiency by empowering users and making their lives easier; and two, do not overdo security to the point where it could paralyze the business.

Author:

Manoj Kumar

Capability Head, Digital workplace,

ITC Infotech

Source Prolead brokers usa

Data modernization: The key to tomorrow’s highly competitive insurance industry

One question haunts every CXO: “How can we make our company and products better for the customer?” The answer, today, is straightforward. Data is what makes and breaks organizations. Data allows organizations to look back with surgical precision on their past actions and to look forward with confidence to remodel the business in near-runtime. For the insurance industry, data has extraordinary significance. “It allows us to tailor services for customers instead of having a set of products you try and sell,” says Paul Johnson, CIO and COO of PIB Group (an ITC Infotech customer), a dynamic and diversified specialist insurance intermediary that provides bespoke solutions for personal and business customers. Johnson is mindful that in a service-oriented environment, data provides the means to know the customer accurately, improve customer experience, boost organizational efficiency, meet compliance requirements, and understand how markets may be shifting. Despite the upside, why is selling a modern data vision to the Board so difficult?

The reason Boards are wary of CIOs pushing ambitious data agendas is that they are rarely shown a reasonable ROI. In the experience of leaders who have driven successful data modernization programs, the Board needs to see two outcomes of data modernization:

ROI—which is not necessarily in terms of monetary impact—is possible in three years. For most insurance companies, legacy databases are, by nature, reporting systems. Boards need to be sensitized to the fact that modernizing data is preferable to overspending on a reporting database as the investment helps shape business strategy through insights. Data modernization must be seen strategically.

Quick wins that allow the board to buy into the data modernization journey. These then allow the organization to do all the other exciting things that make business better: uplifting revenue, improving operational efficiency, and serving customers in different, more personalized ways for more effective upsell and cross-sell.

Five key shifts make investments in data modernization an industry imperative:

  • Direct, digital, and embedded sales will become dominant channels for growth, directly enabling insurance cross-sell and upsell
  • The subscription revolution will see insurance deeply woven into consumers’ everyday life
  • Ecosystems will expand as the cloud and new connections enable radical innovation
  • Real-time risk visibility and responsiveness will become a reality, with huge savings from faster processing and the elimination of fraudulent transactions
  • AI adoption will accelerate change

A simple scenario helps contextualize these shifts. Imagine a customer asking for motor insurance. The right way to tailor the insurance is to seek data on car usage and build an AI-driven risk profile of the customer. Based on this profile, insurance is offered in a subscription model that fits the customer’s needs.
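As a rough illustration of that scenario, here is a toy usage-based pricing sketch in Python. The signals, weights, base rate, and loading factor are all invented for illustration; real actuarial models are far more involved:

```python
# Toy usage-based motor insurance sketch: driving data feeds a 0-1 risk
# score, which scales a monthly subscription premium. Every weight and
# rate below is a made-up placeholder, not an actuarial figure.

def risk_score(miles_per_month, hard_brakes_per_100mi, night_share):
    """Combine usage signals into a 0-1 risk score (illustrative weights)."""
    score = (
        0.4 * min(miles_per_month / 2000, 1.0)
        + 0.4 * min(hard_brakes_per_100mi / 10, 1.0)
        + 0.2 * night_share
    )
    return round(min(score, 1.0), 3)

def monthly_premium(score, base_rate=30.0, max_loading=2.0):
    """Scale a base subscription rate by the risk score."""
    return round(base_rate * (1 + max_loading * score), 2)

# A light, careful driver pays less per month than a heavy, risky one.
low = monthly_premium(risk_score(500, 1, 0.05))
high = monthly_premium(risk_score(2500, 12, 0.5))
```

The design point is the subscription shape: the premium is recomputed from fresh usage data each period instead of being fixed at policy inception.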

While there are several challenges to modernizing data, related to scalable architecture and questions around trust, security, and governance, all can be solved by leveraging an intelligent cloud. But it pays to heed Johnson’s words of wisdom, which come from experience: Don’t bite off more than you can chew—don’t promise the world, promise what you can deliver; get IT and business to collaborate so that a business understanding drives data modernization; and don’t rush it—if you do, you’ll make mistakes.

Insurance organizations have always understood the value of data. It allows them to build risk models which are the bedrock of business. Now, they need to harness internal and external data and use it in real-time to improve customer experience, raise organizational efficiency, and meet complex regulatory requirements.

Author:

Karthik R

Vice President, Digital Experience

ITC Infotech


How Design Thinking Bookends Data Science

Hallelujah brother, I’ve seen the light!

I’m a HUGE believer in the liberating mindset of Design Thinking. For example, Design Thinking not only makes Data Science more effective, it also helps avoid the devastating unintended consequences of an AI capability that can continuously learn and adapt at speeds exponentially faster than humans. I’m honored to be a speaker at the Catalyst Empowerment Summit on August 2, where I will present “Blending Design + Data to Scale Innovation” in this lightning-paced event.

In preparation for the event, let me explain how Design + Data scales innovation by exploring how Design Thinking bookends Data Science.

Defining Value.  The role of Design Thinking before engaging Data Science is to clearly and thoroughly define what one is trying to achieve.  This includes clarifying the intended benefits and “values” across a wide range of perspectives (Figure 1).

Figure 1: Defining Value Challenge

There needs to be a wide range of dimensions of value against which the AI model’s effectiveness and accountability will be measured, and it is NOT just financial metrics.  Get this wrong, and the AI models will optimize exactly what you told them to optimize, which may not have been what you intended (see Terminators, ARIIA, and VIKI for examples of AI models optimizing EXACTLY what the value statements asked them to optimize).

Data Science Customer Journey Map.  We modified the Customer Journey Map to help the data science team identify the key decisions, analytics, and data necessary to help stakeholders successfully complete their journeys (Figure 2).

Figure 2: Data Science Customer Journey Map

The Data Science Customer Journey Map identifies the data and analytic requirements necessary for customers to successfully complete their journeys.

Data Science Collaborative Engagement Process.  The role of Design Thinking during the Data Science engagement process is to create a mindset and operating culture of exploration, trying, and learning.  Throughout the Data Science Collaborative Engagement Process, design thinking drives ideation, exploration, and experimentation in tight collaboration with the business stakeholders – where all ideas are worthy of consideration with a “diverging before converging” mindset (Figure 3).

Figure 3: Data Science Engagement Methodology

The Data Science development process is a non-linear “rapid exploration, discovery, training, testing, failing and learning” process, perfect for integrating design thinking capabilities.

Hypothesis Development Canvas.  The Hypothesis Development Canvas ensures the data science effort directly supports the organization’s critical business and operational initiatives. A well-structured Hypothesis Development Canvas, like a well-structured storyboard, provides a concise yet thorough way to make a use case come to life. I spend an entire chapter in my eBook “The Art of Thinking Like a Data Scientist” on how to create the Hypothesis Development Canvas, if you are interested in learning more.

Design Thinking + Data Science.  Design Thinking and Data Science both embrace a curiosity-driven, rapid exploration, rapid testing, continuous experimentation, failure-enabling, continuously-learning and adapting mindset (Figure 4).

Figure 4: Design Thinking and Data Science engagement Similarities

The primary difference between Design Thinking and Data Science engagement methodologies?  One is focused on machine learning, and the other is focused on human learning.  And when you blend that learning together, it will be the ultimate empowerment of an AI-to-human continuously-learning and adapting environment.

From Technology Outputs to Business Outcomes.  The role of Design Thinking after the Data Science engagement is to support a culture of continuous measurement, experimentation, learning, and adapting focused on business and operational outcomes (Figure 5).

Figure 5: Data Science 2.0: From Outputs to Outcomes

Design Thinking techniques can drive collaboration with business stakeholders to continuously learn and refine the desired business and operational outcomes that lead to new sources of customer, product, and operational value creation.

Empowerment.  Finally, design thinking guides the empowerment of individuals and teams necessary to nurture and scale innovation (Figure 6).

Figure 6: Keys to Scaling Innovation

The keys to nurturing and scaling innovation are:

  • Key #1: Create a common vision and shared purpose around the organization’s “True North.”  That is, what does your organization seek to accomplish, and whom does it seek to serve?
  • Key #2: Establish a Common Language (using Design Thinking – the language of your customers) so there is no confusion about what is being said, and a standard engagement “framework” that guides the value identification, capture, and operationalization processes.
  • Key #3: Embrace organizational Improvisation (improv) with the ability and agility to move team members in and out of teams while maintaining operational integrity.
  • Key #4: Create a culture built on transparency and trust that values everyone’s contributions and their unique and inherent assets and capabilities.  Remember, you can’t mandate trust, you must earn it!
  • Key #5: A Sharing, Reusing, and Refining Operating Environment.  This is the heart of a learning organization: an organization that shares its learnings (both good and bad) and can reuse and refine what others have learned.  This is the ultimate form of standing on each other’s shoulders.

Design Thinking + Data Science scale innovation by identifying, codifying, and operationalizing the sources of customer, product, and operational value creation.  But creating a sustainable innovation environment requires empowering everybody, so that everyone feels empowered to contribute and lead.  It is within this culture that AI-to-human collaboration will drive game-changing innovation and new sources of customer, product, and operational value creation (Figure 7).

Figure 7: Digital Transformation Value Creation Mapping


“All models are wrong, some are useful” ≠ Modeling is a futile exercise

The phrase “All models are wrong, some are useful” is quite loosely used. Some take it in a very literal sense to imply that “Modeling is a futile exercise”.

This is a terrible misunderstanding and we shall see why shortly.

“All models are wrong, some are useful” is an aphorism (meaning it is a concise expression of general truth). But the aphorism in this case leads to misinterpretation.

Firstly, it is important to understand what modeling is.

The purpose of modeling is to provide an abstraction of a real process: basically, a good approximation of reality.

Anybody who mistakes the abstraction for the real, commits the Fallacy of Reification (yup, one more fallacy to add to the list of all fallacies which we data scientists/statisticians commit).


Coming to why the phrase should not be taken literally, a good analogy is that of a map.

In an exact sense, a map is also wrong because it does not provide a 1:1 mapping of the real world.

So, George Box’s phrase should be construed the same way: a model is “wrong” in the same sense that a map is wrong for not representing the real world exactly.

Does a map provide 1:1 mapping to the real world?

No.

But is it mighty useful?

Hell yeah.
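The point can be made concrete with a deliberately "wrong" model: fit a straight line, by hand with ordinary least squares, to points drawn from y = x². The line cannot capture the curvature, so it is wrong, yet on this range its predictions stay within about 0.15 of the truth, which is often useful enough:

```python
# A wrong-but-useful model: a straight line fitted to y = x^2 on [0, 1].
xs = [i / 10 for i in range(11)]
ys = [x * x for x in xs]

# Ordinary least squares for a one-variable line, computed by hand.
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum(
    (x - mx) ** 2 for x in xs
)
intercept = my - slope * mx  # fitted line: y ≈ x - 0.15

# The model is wrong (the truth is curved), but its worst error on
# this range is small, so it remains useful as an approximation.
max_error = max(abs((slope * x + intercept) - y) for x, y in zip(xs, ys))
```

Like the map, the fitted line omits detail by design; whether that omission matters depends entirely on the question being asked.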

Also, talking about the utility of models, John Tukey’s words convey the essence clearly:

“Far better an approximate answer to the right question, which is often vague, than an exact answer to the wrong question.”

So, to summarize

“All models are wrong, some are useful” ≠ “Modeling is a futile exercise”.

Reference: On Exactitude in science — Jorge Luis Borges

Your comments are welcome. 

You can reach out to me on

Linkedin

Twitter


Why You Should Focus on eDiscovery to Ensure Growth for Your Firm

E-Discovery is fast becoming a top-priority tool as litigation goes digital. It has significant implications for how organizations retain, store, and manage their electronic content.

Businesses of all sizes have to go through litigation at some stage. Most civil and criminal cases are accompanied by electronic discovery requests. And if it is your first legal case, this is the best time to get familiar with eDiscovery.

About eDiscovery

E-discovery is simply an extension of the traditional discovery process, covering any Electronically Stored Information (ESI) that an organization possesses, such as email messages, presentations, voicemails, word processing files, tweets, spreadsheets, and any other communication or information that could come in handy in a litigation case. eDiscovery extends to any platform where ESI is stored: computers, laptops, servers, tablets, smartphones, and other electronic devices.

The fact is that nearly all documents these days are stored electronically. Single-plaintiff employment matters, commercial disputes, divorce, personal injury – all types of evidence are now created, collected, and converted into electronic form. Emails, word processing files, financial data, and social media postings are all stored electronically, and even something on paper was probably printed from a computer file.

In addition, even for minor cases, lawyers must know how to handle ESI in order to keep discovery transparent, responsive, and proportional to the factual needs of the case.

How does eDiscovery work?

The standard eDiscovery procedure starts when a lawsuit is anticipated. Attorneys who represent litigants from both parties will establish the scope of the eDiscovery request, research the relevant ESI, and put it on legal hold.

Once the eDiscovery request is issued, the litigants must submit all related ESI for collection and analysis. At this point, it is converted into PDF or TIFF files for court use.
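The scope-and-collect step described above can be sketched as a simple filter: documents matching the matter's custodians and date range are flagged for hold and queued for review. The document fields, custodian names, and dates below are hypothetical, not from any real eDiscovery product:

```python
# Illustrative sketch of eDiscovery scoping: once attorneys agree on
# custodians and a date range, documents inside that scope go on legal
# hold. All data here is invented for illustration.
from datetime import date

def in_scope(doc, custodians, start, end):
    """True if a document falls inside the eDiscovery request's scope."""
    return doc["custodian"] in custodians and start <= doc["sent"] <= end

docs = [
    {"id": 1, "custodian": "alice", "sent": date(2021, 3, 1)},
    {"id": 2, "custodian": "bob", "sent": date(2019, 6, 1)},
    {"id": 3, "custodian": "alice", "sent": date(2022, 1, 15)},
]

# Only documents in scope are flagged for hold and later conversion/review.
hold = [d["id"] for d in docs
        if in_scope(d, {"alice"}, date(2021, 1, 1), date(2021, 12, 31))]
```

Real collection also has to handle deduplication, metadata preservation, and chain of custody; the sketch only shows the scoping decision itself.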

E-discovery in the cloud

E-discovery may present an information technology challenge for firms, as they must govern their ESI to comply with legal and other regulatory requirements. However, cloud-managed eDiscovery services, whether IaaS or part of a hosted service, can address most of these difficulties.

Cloud-based workflow management can automate the complete eDiscovery process and implement more secure, less error-prone eDiscovery compliance. In addition, companies get the extra benefit of reduced data storage and archiving costs.

Data accessibility also becomes more efficient, as cloud-stored information is readily available whenever it is required outside the organization for eDiscovery.

Three key issues that decision-makers should consider:

  • eDiscovery is gaining attention rapidly. However, most corporate decision-makers are unsure whether they are prepared to make changes.
  • Many organizations have encountered eDiscovery issues centered on email. In addition, a growing number of data types and venues further increases the difficulty of eDiscovery and of managing content in general.
  • Rules and regulations regarding eDiscovery are continuously evolving, burdening decision-makers with additional demands for its efficient management.

Key practices firms can follow

There are a variety of practices that an organization can consider for developing an eDiscovery strategy:

  • Focus on the involvement of employees

Policies, procedures, and technologies may be essential components of a solid eDiscovery strategy, but it is equally necessary to educate the employees, consultants, and other members of the organization on the importance of retaining content that can be valuable, using corporate communication and collaboration channels according to corporate policy, and being cautious about deleting any crucial document. Educating employees can play a significant role in implementing or improving eDiscovery.

  • Ensure that IT and legal work together

To establish a robust eDiscovery capability, it is important that the legal and IT departments collaborate on a fundamental approach. E-discovery in the cloud can streamline a legal firm’s review and analysis workflow, and eDiscovery data stored in the cloud can be accessed easily irrespective of location, time, or device.

  • Develop robust eDiscovery policies

It is necessary to create data retention and deletion procedures for each type of content. However, many firms do not address this with adequate urgency even when they undertake it. It is crucial for any firm or organization to retain all electronic data that could be used in current or anticipated eDiscovery. Retention requirements also extend to data types such as social media.

  • Implement deletion policies

Many firms either over-specify the amount of data that must be retained or fail to establish effective data deletion policies, leading to the retention of more data than is required and creating unnecessary liability. This also increases eDiscovery expenses, as more data must be retrieved for review, and drives higher-than-necessary storage costs.

Therefore, it is important that the legal and IT teams work together to conduct a review and ensure that everything complies with regulatory and statutory requirements. Data classification is an essential element here: decision-makers must set specific parameters for what needs to be retained, what can be deleted, and the disposition method that will be used.
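Such classification parameters can be sketched as a small retention policy. The categories and retention periods below are invented placeholders; real policies must come from the legal and regulatory review the text describes:

```python
# Minimal sketch of retention/deletion classification: each data type gets
# a retention period, and anything past it (and not on legal hold) becomes
# eligible for deletion. Categories and periods are made-up placeholders.

RETENTION_DAYS = {
    "email": 365 * 3,
    "financial_record": 365 * 7,
    "chat_log": 180,
}

def disposition(record_type, age_days, on_legal_hold=False):
    """Return 'retain' or 'delete' for one record under the toy policy."""
    if on_legal_hold:
        return "retain"  # legal hold always overrides the schedule
    limit = RETENTION_DAYS.get(record_type)
    if limit is None or age_days <= limit:
        return "retain"  # unknown types are kept by default, cautiously
    return "delete"
```

Note the two safety choices: legal hold overrides everything, and unclassified data defaults to retention, which matches the article's warning against over-aggressive deletion.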

  • Acknowledge the importance of litigation services

If decision-makers believe that litigation is reasonably anticipated, the organization should immediately begin to examine and preserve all data that might be considered relevant, for the complete duration of the litigation.

For example, a claim for breach of contract with a contractor will require the retention of emails and other electronic documents exchanged between employees and the contractor. Documents in which employees discuss the contract or the contractor’s performance must also be retained.

A well-structured and configured eDiscovery and data storage capability helps organizations immediately place a hold on data when a request is made by a court or regulator, or on the advice of legal counsel, and retain it for as long as needed. And if an extra hand is needed to execute things efficiently, firms can enlist litigation support services.

  • Implement the correct technologies

Finally, it is essential that firms adopt systems with appropriate capabilities, such as archiving and storage, that allow them to run the eDiscovery process efficiently. The right capabilities ensure that all relevant information is easily accessible and reviewable at an early stage of a legal case.

In addition, the right technology platform will allow data to be classified as it is created and content to be located wherever it exists.

Key takeaway

The smartest way to ensure that your firm is well prepared for electronic discovery requests is to store every email sent or received by employees. As technology evolves, more unique issues will present themselves, prompting the implementation of more advanced solutions. Today we may feel disrupted by new technologies that, along with promising a better future, bring unexpected issues with them.

However, the latest methods of communication will also create more potential avenues for discovery, so the problems we observe now need not be a trouble in the years ahead.


What is data governance?

Data governance is the process of managing the data’s usability, security, availability, and integrity within an organization using internally set and enforced rules and policies. Effective data governance ensures that data is consistent, accurate, and secure and that it is not misused.

Why does it matter?

Data governance is a must for any organization that seeks to use its data for analysis. It creates an environment where data can thrive as a source of actionable insights that enable the organization to thrive. Without it, data may fail to meet the quality standards required for usable insight extraction, or may be exposed to security threats that compromise its integrity, putting the enterprise at risk of legal action.

Data governance improves data reliability across all of the organization’s business units, making efficient data integration possible. For instance, a distributor’s name may be recorded differently in the procurement office’s database and the factory’s database; for analysts who have never interacted with that supplier, the inconsistency poses a challenge during data integration.

Governance ensures uniformity, so that the analyst does not need to consult the departments generating the data in order to understand it. Governance also goes beyond insight extraction and security by controlling who within the organization has access to the data and how they access it. It is therefore important to understand what data each member of staff needs before setting the rules and policies for accessing data internally.
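The distributor-name example above can be made concrete with a small canonicalization step. The alias table here is illustrative; in practice it would be maintained as governed reference (master) data:

```python
# Sketch of governed name canonicalization: the different spellings of one
# distributor map to a single canonical form before data integration.
# The supplier names and aliases are invented for illustration.

CANONICAL = {
    "acme corp": "ACME Corporation",
    "acme corporation": "ACME Corporation",
    "a.c.m.e.": "ACME Corporation",
}

def normalize(name):
    """Return the governed canonical name, or the cleaned input if unknown."""
    key = name.strip().lower()
    return CANONICAL.get(key, name.strip())

# Procurement and factory records now agree without the analyst having to
# ask either department what the entries mean.
procurement = normalize("ACME Corp")
factory = normalize("  acme corporation ")
```

The lookup table is deliberately simple; the governance point is that the mapping is owned centrally, so every downstream consumer resolves names the same way.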

Goals:

1. Break down data silos within the organization.
2. Ensure the proper use of data.
3. Improve data quality.
4. Ensure compliance.

Benefits:

  • Data protection regulations such as the GDPR, PCI DSS, and the US HIPAA are very strict about how data should be managed. Failure to comply with these laws can lead organizations to incur hefty fines and damage their reputation. Data governance applies these laws early on, thereby protecting the organization’s data.

  • Strong governance ensures that all points of data creation function with data quality as a priority. This leads to an overall improvement of data quality within the organization.

  • Data governance works like an address book for all the data in the organization, ensuring that no data is isolated, by errors of commission or omission, from the organization’s overall rules and policies.

  • The code of conduct and rules established by governance make data management easier, enabling the management of data security and legal compliance.

Strong governance ensures that data is secure in storage and when accessed, and of high quality when created. Governance matters to any organization that intends to use its data for analysis and remain compliant, because it ensures the consistency, integrity, and security of the data. By improving data quality and promoting proper data use, it increases efficiency within the organization and saves data users a lot of time.

To successfully implement it in an organization, the data users should be trained on the policies of governance to ensure they understand what will be required of them. DQLabs’ agile data governance tool makes it easier for an organization to set up strong governance by providing a framework that is comprehensive and is guaranteed to increase the organization’s data value as well as ensure compliance.

