Data Science Applications

The digital age is considered one of the most revolutionary eras in human history. The digital revolution, colloquially called the ‘third industrial revolution’, started with the shift from analog and mechanical technologies to digital ones. This shift gave rise to the field of computer science and its related technologies.

With massive improvements in processing power, storage, cloud technologies, and hardware devices, the digital space has become a vast ecosystem that is ripe for commercialization, with the digital economy generating trillions of dollars each year around the world. Data shared on the internet is now worth as much as gold or oil. But just like oil, data in its raw form is useless; it is only worth something after it is processed into a more useful form.

This is where a new army of data scientists and data analysts comes into the picture. They are responsible for creating programs to process this raw data into a more useful form that can be exploited for commercial purposes.

These commercial purposes can be anything, from finding the most popular flavor in an ice-cream shop based on how many customers buy each one, to designing a road layout that minimizes the likelihood of accidents using data from previous accidents. In short, we can predict the future based on past experience of such events.

WHO ARE DATA SCIENTISTS AND DATA ANALYSTS?

Data analysts are professionals who take raw data, process it using mathematics and statistical analysis, and present it in a way that everyone can understand. They provide valuable insights that can be used to improve business practices and increase profits.

A data scientist, on the other hand, uses arithmetic, statistics, and sometimes calculus to draw conclusions from raw data. They derive meaning from data and use it to theorize about, or predict, what may happen in the future.

A data analyst simply interprets data in layman’s terms. A data scientist analyzes the data, extracts meaning from it, and draws conclusions that the company or organization can act on to increase revenue and improve the customer experience.

DATA SCIENCE AS A CAREER:

The average salary of a data scientist who is just getting into the field is ₹500,000 per year, and those with 1 to 4 years of experience can expect that amount to go up to ₹610,811 per year (Source). Career opportunities in the field have exploded in the past few years, with almost all major software companies hiring data analysts and other related professionals to work on various projects. With data science making rapid progress over the past twenty years and expected to significantly impact the lives of everyone on the planet over the next few years, companies are getting on board to claim their share of the pie. Almost all industries, from hospitality to space exploration to government agencies, are using the power of data to better conduct their business.

India is second only to the United States in terms of job opportunities for data scientists. Almost 50,000 data science jobs are generated in the country every year, and career prospects look plentiful. As you gain experience in the field, your value will only increase. You can also expect MNCs to post you abroad in Europe or the US, with salaries rising to almost $250,000 per year.

There is huge scope for data analysts and scientists in almost every profession today, though the field is getting competitive because of the lure of better career prospects.

TOP 10 DATA SCIENCE APPLICATIONS:

1) Healthcare:

Ever since COVID-19 was declared a pandemic by the World Health Organization (WHO) in March 2020, the data on cases and infections released by governments worldwide, along with data released by the WHO and UN, has been used successfully to track and trace the spread of the disease and to refine the way it is fought.

This has significantly improved contact tracing in countries like South Korea and has helped governments establish networks to analyse infections and alert citizens about possible exposure, slowing the spread of the virus.

Another important application has emerged in Japan, where data science is being used to identify cancerous cells in patients so they can be targeted without harming healthy cells.

2) Planning Airline Routes:

The airline industry is one of the most cash-intensive industries in the world. It is also one of the riskiest, where errors can have life-or-death consequences and profit margins are always thin.

With fuel prices rising and competition fierce, every rupee counts. To keep the cost of operating airplanes reasonable, airlines use data science to predict optimal flight paths, weather conditions, flight delays, and arrival times, and to adjust seat prices, hours before the aircraft leaves the runway.

They also use data science to find which aircraft to use based on fuel consumption and passenger occupancy on each flight, and which aircraft to buy in the future.

Furthermore, data science is also used to deliver the best possible customer experience and to lower the cost of operating and maintaining crews and aircraft.

3) Weather Forecasting and Analysis:

Meteorological departments around the world use data analytics to predict storms, floods, and rainfall hours or even weeks before they happen. These predictions have a huge impact on coastal economies, including fisheries, shipping, and aviation.

In India, the India Meteorological Department uses data science to predict cyclones and storms in the Bay of Bengal every year during the monsoon season to chart out evacuation procedures and give out warnings in advance. Every year, almost 12 lakh people are evacuated from their homes in Odisha, Andhra Pradesh, and West Bengal during cyclone seasons, resulting in lakhs of lives being saved.

4) Targeted Advertising:

Software and social media giants like Facebook, Google, and Twitter use data science to serve targeted ads to users and increase their ad revenues. They identify the user traits an advertiser is looking for and record user interactions with various posts, then target ads based on that data.

Last year, spending on targeted advertising rose to almost $70 billion worldwide, as consumers reportedly click on targeted ads 2.68 times more often than on regular advertising.

With the internet playing a larger role in the lives of everybody, consumption of digital media is only going to increase, and it’s a good bet that targeted ads will too.

5) Banking:

The use of data science in banking is mostly related to security and fraud detection. Transactions that deviate from a standard set of rules can be flagged as suspicious and escalated to supervisors without any human intervention.

Data science is also being used to prevent money laundering and terrorist financing by blacklisting suspicious transactions, tracing them, and reporting them to the authorities.

6) E-Commerce: 

Giant e-commerce companies such as Flipkart and Amazon use data science and data analytics to improve the products displayed to their customers. They try to surface what the customer needs without requiring them to search for each item individually. One big example is the ‘commonly bought together’ feature, which shows the items other customers usually purchase along with the item currently in the cart.
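
Conceptually, such a feature can be built from simple co-occurrence counts over past orders. Here is a minimal, illustrative Python sketch with made-up order data (real recommender systems are far more sophisticated):

```python
from collections import Counter
from itertools import combinations

# Made-up purchase history: each order is the set of items bought together
orders = [
    {"phone", "case", "charger"},
    {"phone", "case"},
    {"laptop", "mouse"},
    {"phone", "charger"},
]

# Count how often each pair of items appears in the same order
pair_counts = Counter()
for order in orders:
    for pair in combinations(sorted(order), 2):
        pair_counts[pair] += 1

def bought_together(item, top_n=3):
    """Return the items most often purchased alongside `item`."""
    scores = Counter()
    for (a, b), count in pair_counts.items():
        if item == a:
            scores[b] += count
        elif item == b:
            scores[a] += count
    return scores.most_common(top_n)

print(bought_together("phone"))  # [('case', 2), ('charger', 2)]
```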

Amazon tries to build a profile of the customer based on their purchases and recommends items to them based on their history. All this is done using data analytics.

E-commerce is also one of the highest-paying fields for data engineers, with starting salaries for good positions typically upwards of ₹8,00,000.

7) Transport:

The most significant achievement of data science in transportation is the development of self-driving, or driverless, vehicles. By pairing data science with Internet of Things (IoT) technologies, computers are expected to replace human drivers in cars, trucks, and many other modes of transport. Algorithms have been developed that allow these vehicles to recognize traffic signals, signs, zebra crossings, and objects such as traffic cones, and to learn from what they encounter. The data scientist’s job here is to design the algorithms so that the data collected is categorized properly and the vehicle’s performance can be optimized.

Data science is currently used to predict vehicle fuel consumption across different terrains and temperatures, as well as the behavior of drivers and traffic. It is also used extensively to identify obstacles on the road and prevent accidents.

Online cab aggregators like Ola and Uber are using data science to analyze passenger and driver profiles, and better match them to ensure a smooth ride. They also use it to save fuel and time, further increasing profits in a cash-intensive industry.

8) Education:

Data science is being used in education to evaluate student performance and guide students in improving on their weaknesses. It is also being used to pair students with instructors who can help them academically and otherwise, and to tailor course material to each student’s comfort level.

Universities also use it to innovate on their curricula and to see which courses are most popular with students so they can invest accordingly.

9) Manufacturing:

Data science is being used extensively in manufacturing to increase throughput and manage raw materials effectively, by predicting wastage and ensuring an adequate and timely supply of materials. Monotonous jobs are being automated at a rapid pace, and data science is being used to improve the performance of machines, particularly those involved in precision manufacturing and autonomous robotics.

With improvements in the field of automation and data science, processes such as 3D printing, batch processing, and repetitive manufacturing have seen giant improvements, resulting in an increase in throughput all over the industry.

10) Gaming:

Data science is extensively used to improve game design and gameplay. Complex scenarios and improved interactive gameplay have been the direct result of creating levels using data science.

In online gaming, companies use data science to build player profiles and pair players with each other in tournaments. Achievements such as levels cleared and missions completed are used as profiling data.

CONCLUSION:

In conclusion, data science is a rapidly growing technology that is found in many critical and commercial fields, with no sign of slowing down. Data is said to be the future, and data analysts and scientists will be paid very well as the applications of the technology grow.

Companies and governments are realizing the need to incorporate data science in their work, and are not shying away from paying big bucks to invest in the technology and manpower that drives it!


Data Collection Services Assisting Organizations to Achieve the Right Business Impact

The digital era calls for every business process, decision, and action to be fed with analytics. Data-spewing technological innovations have led to an abundance of data, and data analytics has become a staple process across business sizes and verticals. As businesses take great pains to collect this data to gain a competitive edge, data collection companies become an enabler in their quest – pooling, categorizing, and processing data to derive business-critical insights. They assist organizations in leading successful data-driven initiatives by overcoming three challenges – accumulation, analysis, and action.

The Need for Data Collection Services

Consider the statistic sourced from PwC’s Global Data and Analytics Survey that states that data-driven organizations are three times more likely to report substantial improvement in decision-making. Unless the data is credible and strategically processed, any insight derived out of it will be flawed, costing resources and time. Data collection services enable businesses to obtain relevant data required for sound strategies and business decisions. Cost-efficient collection of accurate and domain-specific data has been a cornerstone of the knowledge economy, the bedrock of firms ranging from aggregator startups to global corporates. Data collection services offer the right approach towards the first step of business intelligence, assisting companies to ace their peers in the industry.

Data-Enabled Use Cases

Data collection companies are well positioned to capitalize on what data has to offer. In other words, they have superior technical capabilities to collect, analyze, and visualize data through automated processes, supported by a highly competent pool of data mining professionals. Cross-functional and agile data management structures allow them to help client organizations gain the right insights, giving them an edge in a cut-throat competitive landscape. Three major ways in which data collection providers help client organizations accentuate business impact are:

1. Implementing New Business Models

Data-powered implementation of new business models aligns business objectives with the current and forecasted state of demand and supply. Professional data collection services help organizations expand their portfolios, adding value, respect, and credibility that support the value proposition. The provider can offer relevant data, actionable insights, or other valuable secondary information gleaned from the data.

2. Customer Experience Strategies

Space-and-shelf optimization, cross- and upselling, stock and replenishment optimization, dynamic pricing strategies, and assortment optimization are some of the activities that require substantial customer data. Leveraging insights-driven results can help stakeholders effectively manage such customer-centric activities. Data collection companies provide the required data based on business objectives and processes. Essentially, they enable a company to map out its customer experience effectively and offer services accordingly.

3. Streamlining Internal Business Operations

Data-driven insights help in streamlining a company’s internal processes. Supply chain optimization, workforce planning, predictive maintenance, demand planning, and fraud prevention are some of the processes that can be enhanced with the benefit of data. Companies that themselves deal with the collection of massive amounts of data can outsource their primary collection to data collection services, essentially offloading the work for cost-optimization.

From Insights to Action: Converting Data into Business Value

The digital wave led by shifting consumer preferences has compelled companies to collect and analyze data to thrive. However, investing in data collection when consumers have real-time expectations from brands and companies is challenging.

Millions of pieces of data floating around in the form of applications, consumer feedback, advertising, attribution, and more make this task even tougher. Businesses that engage data collection services therefore emerge as front runners in the ever-evolving landscape. They also gain advantages in technology and infrastructure, with access to insights derived from advanced analytics that deliver business impact.

Collaborating with experienced and accomplished data collection outsourcing companies can help businesses tap the true potential of data. All the major firms providing online data collection services can easily integrate various data sources, leverage the most advanced technologies to deliver quicker and deeper analyses, and extract insights that lead to better business performance. The insights derived from harnessing data assist leaders in future-focused strategies that contribute to the growth of the organization.

Source: https://writeupcafe.com/community/data-collection-services-assistin…


Variance vs Standard Deviation

Variance is one of the most widely used measures of dispersion; it measures how far the observations lie from the central value of the data.

Population variance and standard deviation

The average of the squared deviations taken from the mean is called the variance. The population variance is generally denoted by σ² and its estimate (the sample variance) by s². For N population values X1, X2, …, XN with population mean μ, the population variance is defined as
σ² = (1/N) Σ (Xi − μ)²,  i = 1, …, N
where μ is the mean of all the observations in the population and N is the total number of observations. Because of the squaring operation, the variance is expressed in squared units rather than in the original units of the data.

So, we can define the population standard deviation as

 

σ = √[ (1/N) Σ (Xi − μ)² ]

 

Thus, the standard deviation is the positive square root of the mean squared deviation of the observations from their arithmetic mean. More simply, the standard deviation is the positive square root of σ².

 

Sample variance

In most statistical applications, we deal with a sample rather than a population. A set of population observations yields σ², while a set of sample observations yields s². If x1, x2, …, xn is a set of sample observations of size n, then s² is defined as

 

s² = (1/(n − 1)) Σ (xi − x̄)²,  i = 1, …, n,  where x̄ is the sample mean
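
As a quick check of these formulas, here is a minimal Python sketch (standard library only; the data values are made up for illustration, and the sample variance uses the usual n − 1 divisor):

```python
import math

# Made-up observations, for illustration only
data = [4.0, 7.0, 13.0, 16.0]
n = len(data)
mean = sum(data) / n

# Population variance and standard deviation (divide by N)
pop_var = sum((x - mean) ** 2 for x in data) / n
pop_sd = math.sqrt(pop_var)

# Sample variance and standard deviation (divide by n - 1)
sample_var = sum((x - mean) ** 2 for x in data) / (n - 1)
sample_sd = math.sqrt(sample_var)

print(pop_var, pop_sd)        # 22.5 4.743...
print(sample_var, sample_sd)  # 30.0 5.477...
```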

Properties

Effect of a change of origin: Variance and standard deviation have certain appealing properties. Suppose each of the numbers x1, x2, …, xn is increased or decreased by a constant c. Let y be the transformed variable defined as

y = x ± c

where c is a constant. Since the mean shifts by the same amount (ȳ = x̄ ± c), the deviations yi − ȳ = xi − x̄ are unchanged, and therefore σ²(y) = σ²(x).

In other words, adding or subtracting a constant from the variable x has no effect on its σ². So, σ² is independent of a change of origin.

 

Effect of a change of scale: When each observation of the variable is multiplied or divided by a constant c, σ² does change. If each observation is multiplied by c, so that y = cx, then

σ²(y) = c² σ²(x)   and   σ(y) = |c| σ(x)

(and dividing every observation by c divides the variance by c²).

So, we can say that the variance is affected by a change of scale: it depends on the unit of measurement, being multiplied by c² when the variable is multiplied by c.
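
Both properties are easy to verify numerically. Here is a minimal Python sketch using the standard library's statistics module and the same made-up data as in the earlier snippet:

```python
from statistics import pvariance  # population variance (divides by N)

data = [4.0, 7.0, 13.0, 16.0]
c = 5.0

shifted = [x + c for x in data]  # change of origin
scaled = [x * c for x in data]   # change of scale

print(pvariance(data))     # 22.5
print(pvariance(shifted))  # 22.5  -> unchanged: independent of origin
print(pvariance(scaled))   # 562.5 -> multiplied by c**2 (25 * 22.5)
```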

Uses of variance and standard deviation

A thorough understanding of the uses of the standard deviation is difficult at this stage, unless we first acquire some knowledge of theoretical distributions in statistics. The variance and standard deviation of a population measure the dispersion in the population, while the variance and standard deviation of sample observations measure the dispersion in the distribution constructed from the sample. This is best understood with reference to the normal distribution, because a normal distribution is completely defined by its mean and standard deviation.


Top 5 Examples of Conversational User Interfaces

Introduction 

A Conversational User Interface (CUI) is an interface that lets you communicate with a machine to ask questions, place orders, or get information.

Top-notch CUIs offer a more human-like conversation, helping to bridge the gap between physical and online conversations.

Many companies have started to understand the importance of conversational AI and are incorporating it into their marketing strategies. Statistics show that companies using automated conversational marketing witnessed a 10% increase in revenue within 6-9 months.

From the customer’s point of view, 86% of online buyers prefer quick, immediate customer support, which chatbots for small businesses can provide.

There are two main types of CUIs. The first is chatbots, where interaction and communication take place in the form of text. The second is voice assistants like Google Assistant, which you can talk to in order to provide input.

Game-Changing Conversational User Interface Examples 

Here are five of the top CUIs and chatbots for business that cover all the bases and provide a smooth, happy experience to all users.

1. Skyscanner – Travel Search Website 

Skyscanner is an online travel company that launched in 2003. It allows users to find and compare cheap flights and hotels, and also to hire cars.

Skyscanner is the world’s biggest independent flight search engine. In 2016, it raised $192 million to grow its engine and services. In the same year, when conversational AI and chatbots started receiving more recognition, Skyscanner joined the league by introducing their Facebook Messenger bot.

The purpose of this chatbot is to help customers search for flights to any destination through a simple conversation.  

A Brief Walkthrough  

Skyscanner’s Facebook Messenger bot begins well by providing the necessary information on its home page. By displaying information like “The world’s travel search engine” and “Typically replies instantly,” it tells you what it is capable of doing.  

When you continue, the bot welcomes you by your name, thus providing a personalized experience. You can then find flight deals, explore new destinations, or get tips on the best time and route for travelling.  

After selecting the origin city, destination city, and travel dates, the chatbot shows a list of flight options from various airlines along with their rates. It is also capable of sending alerts if there is any change in the pricing. 

Once you compare and choose a flight, the chatbot redirects you to the website to complete the payment.  

Few Brilliant Features  

  • Anywhere 

The “Anywhere” feature is one of Skyscanner’s best features. If you are unsure of your destination, simply typing “anywhere” in the text box will display a list of travel suggestions from the origin city.  

Throughout the process of searching and selecting a flight, Skyscanner’s chatbot constantly confirms the cities and dates that you have chosen. It also allows you to change the details with ease. 

Adapting to New Trends 

Skyscanner is one great example of a company that follows and adapts to new trends. With many people using the Telegram messaging service, Skyscanner introduced a Telegram bot to target a wider audience to search for flights and hotels easily. 

The bot can even understand colloquial terms like “next weekend” or “next Monday” and display the correct options.  

Skyscanner has also added a live chatbot on the Skype platform.

In 2016, Skyscanner also partnered with Amazon’s Alexa allowing users to search for flights through a voice conversation. By asking simple questions “Where are you flying from?” and “Where are you flying to?” Alexa can get the travel details from you and talk you through the relevant flight details. 

Results 

The easy-to-use conversational user interface of Skyscanner is effective in providing relevant details to all customers. In just a few years since the chatbot’s introduction, Skyscanner managed to pass one million traveller interactions with chatbots across all platforms by 2019.

All the minute details show the thought put into designing the chatbot, making it a huge success. 

2. Duolingo – Language Learning Platform 

Duolingo is a language learning platform that provides its services for free to all users on its website and mobile app. Officially released in 2012, Duolingo now offers courses in 38 languages, including fictional languages like Klingon.  

Over the past few years, Duolingo has started to leverage the power of artificial intelligence to alter the courses and make them more convenient for the user.  

With the help of a conversational user interface, Duolingo has revolutionized the language learning sector.  

The Problem 

Duolingo is a great example of a company that analyzes and understands its problems and comes up with solutions to overcome them.

Duolingo understood that the most significant problem they would face would be helping users effectively learn a language. Conversing is what helps learners practice and retain the language. Simply reading words and phrases on a screen would not help in the same way.  

The Solution 

  • Chatbots 

To overcome this obstacle, Duolingo implemented the use of AI-based chatbots. They created and assigned a few characters to the bots, allowing you to have a real conversation in your learning language.  

If you get stuck and don’t know how to reply during the conversation, you can also use the “help me reply” option to get assistance from the bots.  

Duolingo recently took conversational learning to the next level by introducing conversational lessons. This new feature offers practice with words and phrases used in real-life scenarios and will enable you to put those words together to form meaningful sentences.  

Duolingo allows you to listen and repeat commonly used sentences. It also corrects you when you speak or type the wrong word and explains its correct usage. This way, you can learn a language with Duolingo through textual and voice conversations.  

As you learn more words, the difficulty levels increase, giving you thorough learning of the entire language.   

Outcome 

Duolingo’s chatbots and conversational lessons give the user the experience of having a conversation in reality. Duolingo is known for its conversational AI and conversational marketing strategies.   

Since its inception, it has added over 500 million registered users, of which 42 million are active every month.

The coronavirus lockdown between March 11 and April 30 increased Duolingo’s user base by 30 million people. These statistics show the magnitude of Duolingo’s success and that of its CUI.

3. Domino’s – Pizza Restaurant Chain 

Domino’s is one of the most successful pizza restaurant joints across the globe. Today, Domino’s operates 17,800 stores in more than 90 countries selling an average of 3 million pizzas every day.

Over the years, Domino’s has introduced different ways through which customers can order food. One such way is online ordering.  

From 2017 to 2020 alone, Domino’s made 27 million Facebook impressions. This figure alone shows the success of online ordering. But Domino’s did not stop there. They introduced CUI into their business, allowing customers to order food through a bot on Facebook Messenger.

Here are some highlights of Domino’s chatbot for business. 

Meet Dom  

Domino’s named their chatbot “Dom,” giving it a character. This makes the user feel that they are conversing with a person on the other end rather than a computer. Dom makes digital ordering more conversational and simple.  

Pre-set Options 

Dom has pre-set default options programmed into its interface. So, when you want to place an order with Dom, options like “Pizza,” “Pasta,” “Sandwiches,” etc., show up on the screen. All you have to do is select an option and continue to the next step. This eliminates the need to type in your order, thus saving time.  

Dom is also aware of current deals and allows you to apply a deal or coupon to your order.  

Constant Summarizing  

The entire process of ordering a pizza occurs in multiple steps. It includes choosing the size of the pizza, crust, and toppings. Dom makes sure that it constantly summarizes your order while simultaneously adding new information to it at every step.  

Dom also simplifies the process of making changes to the order. Even if you are in the last step (say you are choosing toppings) and feel like changing the pizza crust, Dom will make that change for you while retaining other information (like pizza size) in your order.  

Accepting its Shortcomings  

When Dom is unable to understand the customer’s input, it apologizes and lets the customer know about it. This gesture is appreciated rather than displaying information that is not related to the customer’s request.

Domino’s Voice Ordering 

Dom’s skills also include its ability to place orders through voice commands from users, making pizza ordering easier.  

Domino’s also offers its services on voice-based CUIs like Amazon Alexa, launched in 2017, and Google Assistant, launched in 2019. Through these mediums, you can place your most recent order or track an ongoing order by asking the voice bot to do so.

Domino’s Anyware 

Apart from ordering through chatbots and voice-based CUIs, the Domino’s Anyware initiative allows users to literally order from anywhere. This includes ordering from your car, smart TV, or smartwatch, and through tweets, SMS, and a zero-click app.

Interface 

To put it in a nutshell, Domino’s conversational AI chatbot makes online pizza ordering simple for all customers. The linear flow in Dom’s CUI makes it easy to order food when compared to other alternatives.  

4. Lark – Digital Healthcare Platform 

Lark is a digital healthcare company that offers services in various sectors. It keeps track of daily activities like eating habits and sleeping patterns and aims to improve your fitness and health. It helps people lose weight and also focuses on reducing stress and anxiety.

Lark’s chatbot is an app that dedicates itself to all these activities. Users can interact with their bot through text, voice, and button options.

Varied Responses 

One aspect that fundamentally sets top chatbots like Lark apart from ordinary bots is the variety of its responses on the same topic. Even if you type in the same sentence repeatedly, Lark will respond with a different answer. This small attribute enormously improves its human-like conversational style.

Lark’s responses are also friendly and caring. This is crucial, especially for conversations about mental health and stress. These responses help in motivating the users.  

A Knowledgeable Bot 

For a healthcare bot, knowledge of its domain must be a top priority. Lark is one such bot: it knows its field well because it was created with the help of experts and professionals in the healthcare sector.

For instance, when you tell Lark what you ate for lunch, it can recognize it and place it under a particular category (like veggies or meat). It can then make recommendations (like switching to other categories) so that you consume all kinds of nutrients to maintain a balanced diet.  

In a way, Lark acts as your fitness coach and nutritionist.   

Proven Research 

A comprehensive study was performed on the Lark Weight Loss Health Coach AI (HCAI) to evaluate its effectiveness in weight loss. The results of the study showed the following:

  • It increased the consumption of healthy meals by 31%.

  • An in-app survey showed a 100% response rate.

  • High-risk diabetes patients using conversational AI lost an amount of weight comparable to the loss achieved with lifestyle change programs.

  • The health coach also encouraged positive behavioral changes.  

Achievements 

Over the years, Lark and its conversational user interface have earned a number of accolades.

5. Erica – Bank of America’s CUI 

In 2018, Bank of America launched its own chatbot, "Erica," to help customers with their transactions on the mobile app.

Created using AI, predictive analytics, and cognitive messaging, Erica can help customers in numerous ways, such as:

  • Making payments 

  • Checking account balance 

  • Tracking daily expenditure 

  • Locating past transactions 

  • Checking FICO score 

  • Receiving notifications on pending bills 

Highlights of Erica’s Conversational User Interface 

The home page of the app displays a greeting message that welcomes the user. Through the prompt at the bottom of the page, you can type or voice out your task or query. Erica also displays a message, “See what Erica can do,” which shows all its functions when clicked upon. 

  •  Versatility 

Erica can efficiently understand voice, text, as well as tap inputs from the users. Erica indeed shows its versatility when it comes to understanding the customers’ varied questions. Currently, Erica can understand almost 500,000 different variations of the questions that customers ask.

Erica’s time-to-resolution via voice within the app averages around three minutes. The voice-first attitude of Erica has redefined banking, taking it to a whole new level.

Erica provides meaningful insights that help customers make better decisions. These insights can also help in saving money; Erica does this by suggesting things like putting the cash rewards from your credit cards to better use.

Instead of asking detailed questions or sending out long forms, Erica asks for feedback subtly. Once the tasks are completed, a smiley and a sad emoji appear. You can easily give feedback by tapping on any one of them.  

Analysis of Erica’s Success 

Around 500,000 new users make use of Erica’s services every month. At the end of 2019, Bank of America stated that Erica alone had witnessed over 10 million users and was about to complete 100 million client requests and transactions.

These statistics show that Bank of America hit the bull’s-eye with its conversational AI.

Why is Conversational User Interface Important? 

Using Artificial Intelligence (AI) and Natural Language Processing (NLP), CUIs can understand what the user wants and provide solutions to their requests.

Some of the best CUIs provide the following benefits to the customer and the owner.

  • Provide a personalized and unique experience to all the users 

  • Offer 24×7 support to all customers and clients, eliminating the need for manpower for the same job

  • Can efficiently speak to thousands of customers at the same time 

  • Convenient to use as they are user-friendly 

  • Available across various platforms and channels  

  • Easy to set up for business owners as it requires little to no knowledge in coding 

  • Real-time analytics help business owners and marketers improve their marketing strategies. 

Conclusion 

Going through these game-changing conversational user interfaces and chatbots for business, it is clear that using them in conversational marketing strategies increases a company’s sales and leads to happier customers.

Most of these chatbots also prove that thinking through all the small, minute details and incorporating them into the CUI can take a company a long way forward.

This article was originally published on WotNot.


Top Robotic Process Automation Frameworks in 2021

This post will discuss Robotic Process Automation (RPA), why RPA is needed, and the top RPA frameworks that every business owner should consider relying on in 2021 and beyond.

Let’s start by understanding these one by one.

Why Robotic Process Automation? 

As per Wikipedia,

Robotic process automation (or RPA) is a form of business process automation technology based on metaphorical software robots (bots) or artificial intelligence (AI)/digital workers. Therefore, it is sometimes referred to as software robotics (not to be confused with robot software).

Source 

Also, Gartner is forecasting:

RPA revenue will reach close to $2 billion this year and will continue to rise at double-digit rates beyond 2024.

Source

That makes hiring RPA developers essential for today’s businesses looking to automate tasks and, ultimately, to enhance performance, speed, and productivity.

Let’s look into some reasons why RPA is vital for process automation:

  • Increases Productivity, Speed, and Quality 

Robotic Process Automation can easily be trained to handle repetitive chores faster and much more efficiently than humans ever could.

  • Squeeze Out More Value From Gigantic Data 

Almost every business or organization needs to store gigantic amounts of data nowadays, so much that companies can’t even process all of it. RPA is well suited to parsing vast amounts of data and helping businesses make sense of everything they collect.

Source 

  • Spare Enough Time for Employees to Be More Productive 

RPA helps in freeing up the employees for doing more valuable tasks in the business. It offers workers the ability to work on some more critical tasks, which can easily enhance productivity, quality, and speed of the processes in the business. Since employees will be more excited about their jobs now, it will be much easier to keep them happy. 

  • Be Much More Adaptable to Change 

Organizations have become much more agile in adapting to change since the disruption caused by COVID-19. The need to adapt is high, and RPA helps organizations speed up processes at minimal cost so they can remain efficient in such unwanted situations. As a result, organizations that adopt RPA are better placed to deal with change and disruption than those that don’t.

Source 

Top Robotic Process Automation Frameworks to Try in 2021

Now that we know what RPA is and why it is essential for any business, let’s check out the top five Robotic Process Automation frameworks that can help enterprises be more productive, quicker, and more efficient in delivering quality in 2021 and beyond.

  • Taskt

Earlier known as sharpRPA, Taskt is a free C# program built using the .NET Framework. The best part about Taskt is that it features an easy-to-use drag and drop interface, which leads to a simplified automation process without needing to code. As a result, Taskt is a fantastic tool for teams who are purely C# centric.

Developers with strong Azure/Microsoft background will find it much easier to create scripts with Taskt using C#. Hence, Taskt can be an excellent tool for anyone, especially for those used to developing Microsoft C# solutions. 

The capabilities of Taskt opens up a whole world of possibilities for businesses – they can quickly re-engineer the traditional business processes just by using the simple drag and drop interface. You even have the option to opt for free trials of the app or even set up manually – the choice is yours. 

Why use Taskt? 

  • Free to set up and use
  • Time and cost savior 
  • No more handling of complex accounting tasks 
  • Seamlessly manage accelerating volumes of incoming data

Source

  • TagUI

TagUI is a multilayered and sophisticated tool that incorporates a rich scripting language, allowing developers to express complicated RPA instructions. Each set of instructions, known as a ‘flow,’ is written in TagUI’s scripting language and saved in a text file with the extension ‘.tag’. Each flow can then be executed from a terminal window or command prompt.

Every flow script can quickly identify the following:

  • Instructions to open an app/website
  • Where exactly to click on the screen 
  • What sort of content to type 
  • ‘IF’ and ‘Loop’ instructions 

TagUI’s scripting language is quite rich, which is why people love to rely on this robotic process automation framework. Moreover, once the tool is up and running, it is quite seamless to share scripts as .tag files and build up a library.

  • Open RPA

When your team wants to experience automation while enjoying high-end customization, Open RPA is highly recommended. It is a mature tool built to support and scale with companies of every size.

Open RPA tool features:

  • Seamless integration with leading cloud providers
  • Remote Management
  • Easy remote handling of state 
  • Easy-to-access dashboard for analytics 
  • Scheduling 

Open RPA is one of the two projects by OpenIAP, where IAP stands for Integrated Automation Platforms. The best part about Open RPA is that it is pretty easy to start with, and you don’t have to be a whiz to utilize it. You can completely automate your data and get access to real-time reporting to enhance the organization’s productivity. 

Source

  • Robot Framework 

Robot Framework’s gigantic community of open-source developers has made it a highly reliable and trustworthy RPA solution. The benefits that have made it such a preferred robotic process automation framework for developers can be stated as follows:

  • Robot Framework runs on different platforms, making it one of the most easy-to-use platforms to adopt and implement. 
  • The core framework of the Robot Framework can be extended using the massive library of plugins. 
  • A group of vendors supports the open-source community, which helps the core product get updated quickly.
  • It easily scales to the business’s needs, using the default bots to replicate automations.

Expert developers love this Robotic Process Automation platform, but the tool is a little complicated and may not be recommended for those who are new to RPA.

  • UI.Vision (Kantu)

Earlier known as Kantu, UI.Vision runs either as a plugin in your web browser or as a standalone client. You don’t have to be an expert at writing scripts, since it is driven by a point-and-click interface. UI.Vision is a highly reliable Robotic Process Automation framework even for those new to RPA who don’t have access to unlimited resources.

UI.Vision is an excellent Robotic Process Automation tool for developers. However, it may lack the functionality needed to complete more complicated tasks. For example, more complex controls need terminal window access and scripts, which are not really supported by UI.Vision. 

Source

The Final Thoughts 

Robotic Process Automation frameworks are a friend to any business that wants to be more productive, quick, highly automated, more efficient, and able to deliver quality. Since that is the ultimate goal of every organization, every business stands to benefit. So, we highly recommend you adopt RPA in your firm, experience the benefits of automation, and make your business, employees, and processes more efficient and productive than ever before.


CX Transformation with AI Chatbots is not a “One-Size-Fits-All” Approach

Businesses now realize the need for a customer-centric approach to transforming their customer experience (CX). According to the Zendesk Customer Experience Trends Report 2021, 75 percent of company leaders agreed that the global pandemic accelerated the acquisition of new technologies to get customer-centricity right.

But, there are challenges too.

  • Some of the businesses don’t have the systems and technology to segment and profile customers. 
  • Some lack the processes and operational capabilities.
  • Some of them don’t have all of the components in place to claim they are customer-centric. 
  • Some don’t know what their customers expect or how they want to interact with the business – as opposed to knowing their products, features, or revenue model.

However, in the digital-first world, social messaging is the dominating communication channel consumers are using to interact with brands. Forward-looking businesses are tapping this trend to their advantage using industry-ready AI chatbots to manage customer-centric interactions and forge customer relationships online.

Superior Customer Experience is a Necessity

While adopting the latest AI technologies to improve customer relationships, it becomes imperative for industry leaders to keep an eye on the latest customer engagement trends. Here are a few reasons that explain why a top-notch customer experience is the need of the hour: 

  • To enable a Superior Omnichannel Experience

AI-powered chatbots are capable of preserving information across several digital touchpoints, and even when the conversation is transferred to a live agent, customers don’t have to explain their issues repeatedly. Such availability of information across channels helps businesses provide a consistent omnichannel experience, saving time for customers and amplifying customer engagement.

  • To improve Brand Loyalty & Differentiation

Another success metric for businesses is to consistently improve their brand value in this digital competitive arena. Brand loyalty involves an intrinsic commitment of a consumer to a brand based on the distinctive values it offers. Hence, it becomes an obvious reason for CXOs to leverage a Conversational AI technology that enables instant, relevant responses helping brands provide improved experiences and differentiation.

  • To expand new Customer Base

The biggest success for brands is to acquire new customers and expand their customer base over time. Providing instant prompts with offers, product recommendations, and guiding customers through their conversational journeys enables businesses to broaden their reachability and increase conversions.

Intelligent AI chatbots are fast becoming key enablers to customer support and conversational commerce teams and are instrumental to improving the end-customer experience landscape.

Why are AI Chatbots not a “One-Size-Fits-All” Approach?

AI chatbots are not a “one-size-fits-all” solution. No two brands have the same business needs, so no two chatbots can be the same. An all-in-one solution that works for every business function is a myth. Hence, the approach has to be tailored to the business use cases when building and training an AI chatbot.

When catering to customer support and conversational commerce use cases, a one-size-fits-all approach cannot resolve all customer queries; the responses sound generic to customers and increase dissatisfaction. The right approach is to build around the best and most common industry use cases to improve efficiency and conversions.

Here are a few problems that remain unsolved with the one-size-fits-all approach:

  • Every industry has its distinct use-cases. Today, every industry has its unique business use cases depending on the marketplace and audience they are targeting. Hence, a versatile approach that provides solutions to industry-specific use cases should be the topmost priority for businesses when adopting customer experience automation technology.
  • Non-personalized responses don’t work anymore.  A generic AI Chatbot will not be capable of providing contextual responses across omnichannel digital touchpoints. In the current landscape, this won’t work anymore. The need of the hour is a domain-intelligent AI Chatbot that can end-to-end resolve customer queries, providing a seamless experience to the customer.
  • Customer satisfaction matters. Unhappy Customers are an outcome of poor customer service. A generic AI chatbot will not be able to deliver top-quality support and service as they are not supported or trained to handle domain-specific commonly recurring queries, resulting in increasing customer dissatisfaction.

NLP: The Technology Behind Intelligent Conversations

While it is established that a domain-specific, AI virtual assistant is core to enabling superior customer experience, it’s important to understand the technology behind it.

To understand the pain points, intent, and expectations of a customer in a conversation between a bot and a customer, NLP is the behind-the-scenes technology that makes the magic happen.

Natural Language Processing (NLP) is a subfield of Artificial Intelligence that enables chatbots to understand human language. NLP analyzes the customer’s query, language, tone, intent, and so on, and then uses algorithms to deliver the correct response. In other words, it interprets human language so efficiently that it can carry out an end-to-end interaction automatically and accurately.
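
As a deliberately simplified illustration, the sketch below shows the input/output contract of intent detection in Python. Real NLP engines use trained language models rather than keyword rules, and the intents and keywords here are invented:

```python
# Toy intent detection: keyword rules standing in for a trained NLP/NLU model
INTENT_KEYWORDS = {
    "track_order": ["where is my order", "track", "delivery status"],
    "refund": ["refund", "money back", "return"],
    "talk_to_agent": ["agent", "human", "representative"],
}

def detect_intent(message: str) -> str:
    text = message.lower()
    for intent, keywords in INTENT_KEYWORDS.items():
        if any(keyword in text for keyword in keywords):
            return intent
    return "fallback"  # ask a clarifying question or hand off to a live agent

print(detect_intent("Hi, where is my order #1234?"))  # track_order
print(detect_intent("I want my money back"))          # refund
print(detect_intent("Do you sell gift cards?"))       # fallback
```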

Key Capabilities that NLP provides: 

  • NLP allows chatbots to understand voice input as well as text. 
  • With NLP technology, the chatbot doesn’t need the exact correct syntax to understand customer’s expectations. 
  • Based on its programming mechanism, it can auto-detect languages, context, sentiment, and intent. 
  • Chatbots can either process their response through their NLP engine or by analyzing customer’s browser preferences.

Intelligent AI chatbots are now critical to strengthening a brand’s CX strategy. As cognitive AI-powered technologies continue to develop, business leaders must ensure they adopt chatbot technologies that are agile enough to meet the requirements of their businesses.

Key Capabilities a Powerful AI Chatbot Should Have

An AI-powered full-stack Conversational AI platform enables brands to comprehensively solve business problems end-to-end, and at scale. While looking to adopt a conversational AI solution, some of the key characteristics which CX leaders should look for are as follows:

  • Powerful NLU & ML Intelligence: The turning point in the evolution of chatbots was the advent of two key AI technologies – Natural Language Understanding (NLU) and Machine Learning (ML). The architecture of Natural Language Understanding (NLU) is built on a combination of modules such as Language detection, ASR classification, Context Manager, that work in tandem with deep learning-based encoders to accurately understand natural language and handle user queries with higher precision. Businesses should go with a Conversational AI solution that has a high precision, powerful NLU capability.
  • Ability to Create Domain Intelligent Conversations: Industry-specific AI chatbots embedded with domain-specific intelligence, data dictionaries & taxonomies are trained on thousands of user utterances to deliver human-like conversational experiences at scale. The in-built Named Entity Recognition (NER) engine helps chatbots to understand user intent and context better. As customer conversations are unique to a business, the Conversational AI solution must be agile and help create domain intelligent conversations.
  • Quick to Launch: AI chatbots built using smart NLU and advanced domain intelligence capabilities (packaged as ready-made ‘Smart Skills’) deliver the desired output with minimal effort and training, drawing on a comprehensive library of 100+ ready-to-use, domain-specific intelligent use cases for your business. Technology is getting easier to deploy, and domain-intelligent chatbots can now be launched in a matter of minutes. Businesses should go for a Conversational AI solution that is fast to value and gives quick ROI.
  • Comprehensive integration to build a Full Stack solution: An AI solution that can be easily integrated into your existing CRMs, help desk software, etc helps create a full-stack solution with only one source of truth. The best scenario, in this case, is that the integration of these AI solutions should not require deep-coding dependencies or complex technical processes. Businesses should adopt an easy-to-integrate Conversational AI solution that has a comprehensive integration ecosystem.

CX Trends to Look out for in 2021

While the above-mentioned capabilities of Conversational AI sound interesting and intriguing, they are only the tip of the iceberg. The technology has only just entered the digital space and is expected to evolve further with time. With that in mind, here are the top four customer experience trends businesses might come across in 2021 and beyond.

  • The build-to-buy switch: Considering the increasing popularity, organizations find it optimal in terms of cost to purchase already-built tools and then customize them, instead of building one from scratch.
  • Emphasis on the what and how of customers: Conversational AI tools in 2021 are more efficient; they are designed to understand human language more quickly and to give human-like responses.
  • Deploy models (process-oriented) that are more than a messaging bot: Since organizations are on the lookout for automating a large part of their customer interaction funnel, emphasis is laid on the creation of tools that are one step ahead of the basic designs and can automate end-to-end queries and processes which are repetitive.
  • Consolidation of customer support, marketing, and sales departments: To offer an omnichannel experience, the next wave of conversational bots is bringing together the different departments in an organization to achieve a common goal of customer experience.

Final Words

CX transformation is a catch-all phrase that means something different for every business, and there should be different strategic approaches when it comes to deploying AI-powered technologies. However, it is established that a simple AI chatbot will not deliver the kinds of experiences that a Conversational AI solution can enable.

In case you’re interested to explore more, here’s an eBook we’ve put together that shares the experiences of a diverse set of CxOs as a part of their journey to identify feasible, realistic solutions to solve the challenge of repairing a broken customer experience and scaling high-volume customer queries with AI Automation. Get your copy here.

Join us in our journey to transform Customer Experience with the power of Conversational AI.


Covid-19: Fundamental Statistics that are Ignored

This is not a discussion of whether the data is flawed, or whether we are comparing apples to oranges (given the way statistics are gathered in different countries). These are of course fundamental questions, but here I will only use data (provided by Google) that everyone seems to more or less agree with, and I am not questioning it here.

The discussion is about why some of that data makes the news every day, while other critical parts of the same public data set are mentioned nowhere. I will focus here on data from the United Kingdom, which epitomizes the trend that all media outlets cover on a daily basis: a new spike in Covid infections. The trend is less pronounced in most other countries, though they could take the same path in the future.

The three charts below summarize the situation, but only the first chart is discussed at length. Look at these three charts and see if you can find the big elephant in the room. If you do, there is no need to read the remainder of my article! The data comes from this source. You can do the same research for any country that provides reliable data.

Of course, what nobody talks about is the low ratio of hospitalizations per case, which is down significantly, by an order of magnitude, compared to previous waves. Even lower is the number of deaths per case. Clearly, hospitalizations are up, so some real worsening is taking place, and deaths take 2 to 3 weeks to show up in the data. This is why I selected the United Kingdom: the new wave started a while back, yet deaths are not materializing (thankfully!).

This brings a number of questions:

  • Are more people getting tested because they are flying again around the world and vacationing, or asked to get tested by their employer?
  • Are vaccinated people testing positive but don’t get sick besides 24 hours of feeling unwell just after vaccination?
  • Are people who recovered from Covid testing positive again, but like vaccinated people, experience a milder case, possibly explaining the small death rate?

It is argued that 99% of those hospitalized today are unvaccinated. Among the hospitalized, how many are getting Covid for the first time? How many are getting Covid for the second time? Maybe the latter group behaves like vaccinated people, that is, very few need medical assistance. And overall, what proportion of the population is either vaccinated or recovered (or both)? At some point, most of the unvaccinated who haven’t been infected yet will catch the virus. But no one seems to know what proportion of the population fits in that category. At least I don’t. All I know is that I am not vaccinated but have recovered from Covid once if not twice. 

To receive a weekly digest of our new articles, subscribe to our newsletter, here.

About the author:  Vincent Granville is a data science pioneer, mathematician, book author (Wiley), patent owner, former post-doc at Cambridge University, former VC-funded executive, with 20+ years of corporate experience including CNET, NBC, Visa, Wells Fargo, Microsoft, eBay. Vincent is also self-publisher at DataShaping.com, and founded and co-founded a few start-ups, including one with a successful exit (Data Science Central acquired by Tech Target). He recently opened Paris Restaurant, in Anacortes. You can access Vincent’s articles and books, here.

Of Superheroes, Hypergraphs and the Intricacy of Roles

In my previous post, in which I discussed names, I also led in with the fact that I am a writer. Significantly, I did not really talk much about that particular assertion, because it in fact comes with its own rabbit hole, quite apart from the one associated with names and naming. Specifically, this assertion is all about roles.

Ontologists, and especially neophyte data modelers, often get caught up in the definition of classes, wanting to treat everything as a class. However, there are two types of things that don’t fit cleanly into traditional notions of class: roles and categorizations. I’m going to keep the focus of this article on roles, preferring to treat categorizations separately, though the two have a lot of overlap.

In describing a person, it’s worth dividing the task into those things that describe the physical body and those things that describe what that person does. The first can be thought of as characteristics: height, weight, date of birth (and possibly death), skin color, physical gender, scars and marks, hair color, eye color, and so forth. Note that all of these things change over time, so blocks of characteristics may be recorded at intervals, with some kind of timestamp indicating when the characteristics were assessed.

In a purely relational database, these personal characteristics would be grouped together in a cluster of related content (a separate table row), with each row having its own timestamps. The foreign key of the row would be a reference to the person in question.

In a semantic triple store (a hypergraph), this relationship gets turned around a bit: a block of characteristics describes the individual, and the “parent” of the block is that person. In UML this would be considered a composition, but SQL cannot in fact differentiate between a composition (a characteristic that is intrinsic to its parent) and an association (a relationship between two distinct entities). That’s why SQL treats both as second normal form relationships.

A semantic graph, on the other hand, is actually what’s called a hypergraph. This is a point that a surprising number of people even in the semantic world don’t realize. In a normal, directed graph, if you have an assertion where the subject and the predicate are the same, then you can have only one object for that pair (keeping in mind that the arrow of the edge in the directed graph always goes from subject to object, not the other way around). This is in fact exactly what is described in SQL – you can’t have the same property for a given row point to different foreign keys.

In a hypergraph, on the other hand, there is no such limit: the same subject and predicate can point to multiple objects without a problem. This means that you CAN in fact differentiate a composition from an association in the graph, rather than through an arbitrary convention. The downside, however, is that while all graphs are hypergraphs, not all hypergraphs are graphs in the strictest sense of the word. Put another way, the moment you have a property for a given subject point to more than one object, you cannot represent that hypergraph in a relational database.
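To make this concrete, here is a minimal Python sketch using the rdflib library (my choice for illustration; the article itself does not prescribe a toolkit), with invented namespace and resource names.

#Python
from rdflib import Graph, Namespace

EX = Namespace("http://example.org/")
g = Graph()

# Same subject and same predicate, two different objects: perfectly legal in a triple store...
g.add((EX.person1, EX.hasRole, EX.novelist))
g.add((EX.person1, EX.hasRole, EX.screenwriter))

# ...whereas a single foreign-key column in SQL could hold only one of these values.
for role in g.objects(EX.person1, EX.hasRole):
    print(role)
print(len(g))  # 2: both assertions coexist for the same (subject, predicate) pair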

Ironically, this is one of the reasons that compositions tend to be underutilized in programming: they are a pain to serialize in SQL. Compositions are easier to manage in both XML and JSON, though in the case of JSON this is only because properties can take arrays as arguments (XML uses sequences and is actually much better oriented for working with them). An array is a data structure, which means that an array with a single value is NOT structurally the same thing as a singleton entity, and that can be a real problem when dealing with serialized JSON versions of RDF.
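Here is a small illustration of that JSON pitfall: the same assertion serialized as a bare value and as a one-element array are structurally different, so consumers typically have to normalize. The property name is invented for the example.

#Python
import json

# The same logical assertion, serialized two different ways.
doc_scalar = json.loads('{"hasAuthor": "alice"}')
doc_array = json.loads('{"hasAuthor": ["alice"]}')

print(doc_scalar["hasAuthor"] == doc_array["hasAuthor"])  # False: str vs. single-element list

def as_list(value):
    # Normalize a JSON property so single values and arrays compare equally.
    return value if isinstance(value, list) else [value]

print(as_list(doc_scalar["hasAuthor"]) == as_list(doc_array["hasAuthor"]))  # True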

Given all that, the reason hypergraphs are so relevant to roles is that a role is intrinsically a hypergraph-type relationship. Consider a given person: that person may simultaneously be a novelist and a screenwriter, and could even be a producer (the US term; the UK still uses “showrunner” for the person largely responsible for the overall story of several episodes of a show). Just to make things even more exciting, it’s possible for that same person to be an actor at the same time as well.

What makes roles so problematic is that they are not just descriptive. A screenwriter writes a draft of a story for a production. An author writes a book. They may describe the same basic story, but beyond the essence of the story, the two may be dramatically different works (J.R.R. Tolkien wrote the Lord of the Rings series, but there have been multiple adaptations of these books by different screenwriters over the years). This means that the information that a role provides will likely vary from role to role.

In point of fact, a role is an example of a more abstract construct which I like to call a binding. All bindings have a beginning and an end, and they typically connect two or more entities together. Contracts are examples of bindings, but so are marriages, job roles, and similar entities. In the case of a script binding, it would look something like this (using the Templeton notation I discussed in my previous article):

#Templeton
?Script a Class:_Script;
       Script:hasAuthor ?Person; #+ (<=entity:hasPerson)
       Script:hasProduction ?Production; #
       Script:hasWorkingTitle ?Title; #xsd:string? (<=skos:prefLabel)
       binding:hasBeginDate ?BeginDate; #xsd:dateTime
       binding:hasEndDate ?EndDate; #xsd:dateTime?
       binding:hasVersion ?version; #xsd:string
       .

Where is the role here? It’s actually implied and contextual. For instance, relative to a given production you can determine all of the scriptwriters after the fact:

#Sparql
construct {
    ?Production production:hasScriptWriter ?Person.
    }
where {
    ?Script Script:hasProduction ?Production.
    ?Script Script:hasAuthor ?Person.
    ?Person Person:hasPersonalName ?PersonalName.
    ?PersonalName personalName:hasSortName ?SortName.
    }
order by ?SortName

In other words, a role, semantically, is generally surfaced rather than explicitly stated. This makes sense, of course: over time, a production may have any number of scriptwriters. One of the biggest mistakes data modelers make is forgetting that anything with a temporal component should be treated as having more than one value. This is especially true of people who came to modeling through SQL, because SQL is not a hypergraph and as such can only express many-to-one relationships directly, not one-to-many relationships.
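As a rough illustration of surfacing a role at query time, the following sketch runs a CONSTRUCT query similar to the one above over a toy graph. The use of rdflib, the namespace, and the instance names are all my assumptions for the example.

#Python
from rdflib import Graph, Namespace

EX = Namespace("http://example.org/")
g = Graph()

# Toy data: one script tied to a production and to two authors.
g.add((EX.script1, EX.hasProduction, EX.production1))
g.add((EX.script1, EX.hasAuthor, EX.alice))
g.add((EX.script1, EX.hasAuthor, EX.bob))

# Surface the scriptwriter role as a new relationship between production and person.
q = """
PREFIX ex: <http://example.org/>
CONSTRUCT { ?production ex:hasScriptWriter ?person . }
WHERE {
  ?script ex:hasProduction ?production .
  ?script ex:hasAuthor ?person .
}
"""
for s, p, o in g.query(q):
    print(s, p, o)  # two surfaced ex:hasScriptWriter triples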

By the way, there’s one assertion format in Templeton I didn’t cover in the previous article:

Script:hasWorkingTitle ?Title. #xsd:string? (<=skos:prefLabel)

The expression (<=skos:prefLabel) means that the indicated predicate Script:hasWorkingTitle should be seen as a subproperty of skos:prefLabel. More subtly,

Script:hasAuthor ?Person; #+ (<=entity:hasPerson)

indicates that the predicate Script:hasAuthor is a subproperty of entity:hasPerson.

Note that this is a way of getting around a seemingly pesky problem. The statement:

?Script Script:hasAuthor ?Person; # (entity:hasPerson)

is a statement that indicates that there is an implicit property entity:hasPerson that Script:hasAuthor is a sub-property of, and by extension, that Class:_Author is a subclass of Class:_Person. However, with the binding approach, there is no such explicit formal class as Class:_Author. We don’t need it! We can surface it by inference, but given that a person can be a scriptwriter for multiple productions and a production can have multiple scriptwriters, ultimately, what we think of as a role is usually just a transaction between two distinct kinds of entities.
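To illustrate surfacing by inference rather than by declaring a Class:_Author, the following sketch records only the subproperty assertion and then finds every “author” through an rdfs:subPropertyOf* property path. The URIs, and rdflib itself, are assumptions made for the example.

#Python
from rdflib import Graph, Namespace
from rdflib.namespace import RDFS

EX = Namespace("http://example.org/")
g = Graph()

# Only the subproperty assertion and the data; no Class:_Author anywhere.
g.add((EX.hasAuthor, RDFS.subPropertyOf, EX.hasPerson))
g.add((EX.script1, EX.hasAuthor, EX.alice))

# Surface every person reachable through hasPerson or any of its subproperties.
q = """
PREFIX ex: <http://example.org/>
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
SELECT ?subject ?person WHERE {
  ?prop rdfs:subPropertyOf* ex:hasPerson .
  ?subject ?prop ?person .
}
"""
for row in g.query(q):
    print(row.subject, row.person)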

Staying with media, this odd nature of roles can also be seen in the discussion of characters. I personally love to use comic book characters to illustrate the complex nature of roles because, despite the seeming childishness of the topic, characters are among the hardest relationships to model well. The reason is that superheroes and supervillains are characters in stories, and those stories are periodically redefined to keep the characters relevant over time.

I find it useful in modeling to recognize a distinction between a Person and what I refer to as a Mantle. A person, within a given narrative universe, is born, lives a life, and ultimately dies. A mantle is a role that the person takes on, with the possibility that, within that narrative universe, multiple people may take on the mantle over time. For instance, the mantle of Iron Man has been passed from one person to another.

Notice that I’ve slipped in the concept of a narrative universe. A narrative universe, or Continuity for brevity, is a storytelling device that says that people within a given continuity have a consistent history. In comics, this device became increasingly used to deal with inconsistencies that would emerge over time between different writing teams and books. Marvel’s Tony Stark, for example, is the same “person” across different continuities, though each continuity gives him a distinct backstory and history.

I use both Tony Stark and Iron Man here because Tony Stark wore the mantle of Iron Man in most continuities, but not all Iron Man characters were Tony Stark. When data modeling, it’s important to look for edge cases like this, because they often point to weaknesses in the model.

Note also that while characters are tied to the media their stories come from, a distinction needs to be made between the same character in multiple continuities and the same character at different ages (temporal points) within one continuity. Obi-Wan Kenobi was played by multiple actors over several different presentations, but there are only a small number of continuities in the Star Wars universe, with the canonical continuity remaining remarkably stable.

Finally come the complications of different actors performing different aspects of the same character in productions. James Earl Jones and David Prowse performed the voice and the movement, respectively, of Anakin Skywalker wearing the mantle of Darth Vader in the original trilogy, while Hayden Christensen played Anakin before he donned the cape and helmet, Jake Lloyd played him as a child, and Sebastian Shaw played the dying Skywalker in Return of the Jedi. If you define a character as the analog of a person in a narrative world, then the true relationship can get complex:

#Templeton
# A single consistent timeline or universe.
?Continuity a Class:_Continuity.

# A self-motivated entity that may live within multiple continua, aka Tony Stark
?Character a Class:_Character;
     Character:hasPersonalName ?PersonalName; #+
     Character:hasStartDate ?CharacterStartDate; #ContinuityDate
     Character:hasEndDate ?CharacterEndDate; #?
     .

# A role that a character may act under, aka Iron Man
?Mantle a Class:_Mantle;
     Mantle:hasName ?MantleName;
     .

# A character from a given continuum acting under a given mantle
?CharacterVariant a Class:_CharacterVariant;
     CharacterVariant:hasCharacter ?Character;
     CharacterVariant:hasMantle ?Mantle;
     CharacterVariant:hasContinuity ?Continuity;
     CharacterVariant:hasStartDate ?CharacterMantleStartDate; #ContinuityDate
     CharacterVariant:hasEndDate ?CharacterMantleEndDate; #ContinuityDate?
     .

# A single sequential narrative involving multiple characters within a continuity.
?StoryArc a Class:_StoryArc;
     StoryArc:hasContinuity ?Continuity;
     StoryArc:hasNarrative ?Narrative; #xsd:string
     StoryArc:hasStartDate ?ArcStartDate; #ContinuityDate
     StoryArc:hasEndDate ?ArcEndDate; #ContinuityDate
     .

# A block of characteristics that describes what a given character can do within a story arc.
?Characteristics a Class:_Characteristics;
     Characteristics:hasCharacterVariant ?CharacterVariant;
     Characteristics:hasStoryArc ?StoryArc;
     Characteristics:hasPower ?Power; #*
     Characteristics:hasWeakness ?Weakness; #*
     Characteristics:hasNarrative ?Narrative; #xsd:string
     Characteristics:hasAlignment ?Alignment;
     .

# A person performing some aspect of a character variant within a given story arc.
?ActorRole a Class:_ActorRole;
     ActorRole:hasActor ?Person;
     ActorRole:hasCharacterVariant ?CharacterVariant;
     ActorRole:hasStoryArc ?StoryArc;
     ActorRole:hasType ?ActorRoleType;
     .

?Production a Class:_Production;
     Production:hasTitle ?ProductionTitle;
     Production:hasStoryArc (?StoryArc+);
     .

This is a complex definition, and it is also still somewhat skeletal (I haven’t even begun digging into organizational associations yet, though that’s coming to a theatre near you soon). It can be broken down into a graphviz graph (which can actually be created fairly handily from Templeton), something like the following:

digraph G {
    node [fontsize="11", fontname="Helvetica"];
    edge [fontsize="10", fontname="Helvetica"];
    Continuity [label="Continuity"];
    Character [label="Character"];
    Mantle [label="Mantle"];
    CharacterVariant [label="CharacterVariant"];
    StoryArc [label="StoryArc"];
    Characteristics [label="Characteristics"];
    ActorRole [label="ActorRole"];
    Production [label="Production"];
    Person [label="Person"];
    ActorRoleType [label="ActorRoleType"];
    CharacterVariant -> Character [label="hasCharacter"];
    CharacterVariant -> Mantle [label="hasMantle"];
    CharacterVariant -> Continuity [label="hasContinuity"];
    StoryArc -> Continuity [label="hasContinuity"];
    Characteristics -> CharacterVariant [label="hasCharacterVariant"];
    ActorRole -> Person [label="hasActor"];
    ActorRole -> CharacterVariant [label="hasCharacterVariant"];
    ActorRole -> StoryArc [label="hasStoryArc"];
    ActorRole -> ActorRoleType [label="hasType"];
    Production -> StoryArc [label="hasStoryArc"];
}

and rendered as the following:

We can use this as a template to discuss Tony Stark and Iron Man in the Marvel Cinematic Universe (MCU):

This may seem like modeling overkill, and in many cases it probably is (especially if the people being modeled are limited to a single continuum). However, the lesson to be learned here is that it is not THAT much overkill. People move from one organization to the next, and take on jobs (and hence roles) and shed them, so especially if what you are building is a knowledge base, one of the central questions you have to ask when modeling is: “What will the world look like for this individual in three years, or ten years?” The biggest reason that knowledge graphs fail is not that they are over-modeled, but that they are hideously under-modeled, primarily because building knowledge graphs means looking beyond the moment.

More importantly, if we take out the continuum and the actor/production connections, this model collapses very nicely into a simple job role model, which is another good test of a model: a good model should gracefully collapse to a simpler form when things like continuity are held constant. The reason Characteristics is treated as a separate entity in the above model is that the characteristics of a person change with time. Note that this characteristic model ties into the character (or person) rather than the role, though it’s also possible to assign characteristic attributes that are specific to the role itself (the President of the United States can sign executive orders, for instance, but once out of that role, the former president cannot do the same thing).
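As a sanity check on that claim, here is a minimal sketch of the collapsed job role model in Python; the class and field names are invented for illustration and are not part of the Templeton model above.

#Python
from dataclasses import dataclass
from datetime import date
from typing import Optional

# Collapsed "binding" model: a role is a dated connection between entities,
# not a subclass of Person. The names here are illustrative only.

@dataclass
class Person:
    name: str

@dataclass
class Organization:
    name: str

@dataclass
class JobRole:  # the binding: person + organization + dated role
    person: Person
    organization: Organization
    title: str
    begin_date: date
    end_date: Optional[date] = None  # open-ended while the role is still held

alice = Person("Alice")
acme = Organization("Acme Studios")
roles = [
    JobRole(alice, acme, "Screenwriter", date(2020, 1, 1), date(2021, 6, 30)),
    JobRole(alice, acme, "Producer", date(2021, 7, 1)),  # same person, new role over time
]

# "Who is currently a producer?" is surfaced from the bindings, not stored on Person.
print([r.person.name for r in roles if r.title == "Producer" and r.end_date is None])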

I recently finished watching the first season of Disney/Marvel’s Loki. Beyond being a fun series to watch, Loki is a fantastic deconstruction of the whole notion of characters, mantles, variants, and roles within intellectual works. It also gets into the question of whether we actually live in a multiverse (the many-worlds theory) or whether quantum mechanics implies that the universe simply creates properties when it needs to, with the cost of those properties being quantum weirdness.

Next up in the queue – organizations, and modeling instances vs sets.

E-R Diagram: 5 Mistakes to Avoid
  • Good E-R diagrams capture core components of an enterprise.
  • 5 things not to include in your diagram.
  • Tips to guide you in your E-R diagram design.

In my previous blog post, I introduced you to the basics of E-R diagram creation. In this week’s post, I show you how not to make one. In addition to creating a clean, readable diagram, you should avoid any of the following poor practices:

  1. Saying the same things twice with redundant representations.
  2. Using arrows as connectors (unless it’s indicating a cardinality of “one”). 
  3. Overusing composite, multivalued, and derived attributes.
  4. Including weak entity sets when a strong one will suffice.
  5. Connecting relationships to each other.

1. Don’t include redundant representations.

Redundancy is when you say the same thing twice. As well as wasting space and creating clutter, it also encourages inconsistency. For example, the following diagram states the manufacturer of “wine” twice: once as a related entity and once as an attribute.

If you include two (or more) instances of the same fact like this, it could create problems. For example, you may need to change the manufacturer in the future. If you forget to change both instances, you’ve just created a problem. Even if you remember to change both, who’s to say you (or someone else) didn’t add a third or fourth? Stick to one representation per fact and avoid trouble down the road.
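To see why this matters in practice, here is a minimal sketch of the resulting update anomaly in Python with SQLite; the table and column names are invented for illustration and are not taken from the diagrams above.

#Python
import sqlite3

# Hypothetical schema that records the manufacturer of a wine twice:
# once as a foreign key and once as a redundant name column.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE manufacturer (id INTEGER PRIMARY KEY, name TEXT)")
con.execute("CREATE TABLE wine (id INTEGER PRIMARY KEY, label TEXT, "
            "manufacturer_id INTEGER REFERENCES manufacturer(id), "
            "manufacturer_name TEXT)")  # manufacturer_name duplicates the same fact

con.execute("INSERT INTO manufacturer VALUES (1, 'Old Cellars')")
con.execute("INSERT INTO wine VALUES (1, 'Red Blend', 1, 'Old Cellars')")

# The manufacturer is renamed, but only one of the two representations gets updated...
con.execute("UPDATE manufacturer SET name = 'New Cellars' WHERE id = 1")

# ...so the two copies of the same fact now disagree.
print(con.execute("SELECT m.name, w.manufacturer_name FROM wine w "
                  "JOIN manufacturer m ON m.id = w.manufacturer_id").fetchone())
# ('New Cellars', 'Old Cellars')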

2. Don’t Use Arrows as Connectors

Arrows have a very specific meaning in E-R diagrams: they indicate cardinality constraints. Specifically, a directed line (→) indicates “one,” while an undirected line (-) signifies “many” [2]. The following E-R diagram (C) shows an example of when you should use an arrow: a customer has a maximum of one loan via the relationship borrower, and each loan is associated with a single customer via the same borrower relationship. Diagram (D), on the other hand, shows that a customer may have several loans and each loan may be associated with multiple borrowers.

It’s possible to place more than one arrow out of a ternary (or higher-degree) relationship. However, since multiple arrows can be interpreted in different ways, it’s best to stick to one arrow.

3. Don’t Overuse Composite, Multivalued, and Derived Attributes

Although you have many different elements to choose from in a diagram, that doesn’t mean you should use all of them. In general, try to avoid composite, multivalued, and derived attributes [2]; they quickly clutter up a diagram. Consider the following two E-R diagrams.

The first (A) shows some basic customer information. Diagram (B) shows the same information with the addition of many unnecessary elements. For example, although it’s possible to derive AGE from DATE OF BIRTH, it is not necessary to include it in the diagram.

4. Limit use of weak entity sets

A weak entity set can’t be identified by the values of its own attributes: it depends on another entity for identification. Instead of a unique identifier or primary key, you have to follow one or more many-to-one relationships, using the keys of the related entities as part of its identifier. It’s a common mistake for beginning database designers to make all entity sets weak, supported by all other linked entity sets. In the real world, unique IDs are normally created for entity sets (e.g., loan numbers, driver’s license numbers, social security numbers) [2].

Sometimes an entity might need a little “help” with unique identification, so look for ways to create unique identifiers. For example, a dorm room is a weak entity because it requires the dormitory information as part of its identity. However, you can turn this weak entity into a strong one by uniquely identifying each room with its name and location [3].

Before you even consider using an entity set, double check to make sure you really need one. If an attribute will work just as well, use that instead [1].

5. Don’t connect relationship sets

Connecting relationship sets may make sense to you, but don’t do it; it isn’t standard practice. Connecting one relationship set to another is much like using a diamond to represent an entity: you might know what it means, but no one else will. Take this rather messy example of a wine manufacturer.

The “bought by,” “sold by,” and “manfby” relationships are all connected. It could be that manufacturers buy their own wine back from themselves, or perhaps manufacturers sometimes sell their own product. Whatever relationship is going on here, it’s confused and muddled by the multiple connected relationships.

Unless you want a meeting with your boss to explain what exactly your diagram means, leave creativity to the abstract artists and stick with standard practices.

References:

Images: By Author

[1] Chapter 2: Entity-Relationship Diagram.

[2] Entity-Relationship Model.

[3] Admin: Modeling.

AI powered cyberattacks – adversarial AI

In the last post, we discussed an outline of AI-powered cyberattacks and their defence strategies. In this post, we will discuss a specific type of attack, called the adversarial attack.

Adversarial attacks are not common yet, because there are not many deep learning systems in production, but we expect them to increase soon. Adversarial attacks are easy to describe. In 2014, a group of researchers found that by adding a small amount of carefully constructed noise, it was possible to fool a CNN-based computer vision model. For example, as shown below, we start with an image of a panda, which is correctly recognised as a “panda” with 57.7% confidence. After adding the noise, the same image is recognised as a gibbon with 99.3% confidence. To the human eye, both images look the same, but for the neural network, the result is entirely different. This type of attack is called an adversarial attack, and it has implications for self-driving cars, where traffic signs could be spoofed.

Source: Explaining and Harnessing Adversarial Examples, Goodfellow et al, ICLR 2015.
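The noise in the panda example comes from the fast gradient sign method (FGSM) introduced in the paper cited above. Below is a minimal PyTorch sketch of the idea; the model, inputs, labels, and epsilon value are placeholders you would supply, so treat it as an illustration rather than a drop-in attack implementation.

#Python
import torch
import torch.nn.functional as F

def fgsm_attack(model, x, y, eps=0.007):
    # Fast Gradient Sign Method: nudge each pixel in the direction that increases the loss.
    x = x.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(x), y)   # loss with respect to the true labels y
    loss.backward()                       # gradients of the loss with respect to the input pixels
    x_adv = x + eps * x.grad.sign()       # small step along the sign of the gradient
    return x_adv.clamp(0, 1).detach()     # keep pixel values in a valid range

# Usage (placeholders): model is a trained classifier, x a batch of images, y the true labels.
# x_adv = fgsm_attack(model, x, y)
# model(x).argmax(1) and model(x_adv).argmax(1) will often disagree, as in the panda example.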

There are three scenarios for this type of attack:

  1. Evasion attack: This is the most prevalent sort of attack. During the testing phase, the adversary tries to circumvent the system by altering malicious samples. This scenario assumes that the training data is unaffected.
  2. Poisoning attack: This form of attack, also known as contamination of the training data, occurs during the machine learning model’s training phase. An adversary attempts to poison the system by injecting carefully crafted samples, thereby jeopardizing the entire learning process.
  3. Exploratory attack: Exploratory attacks have no effect on the training dataset. Given black-box access to the model, they aim to learn as much as possible about the underlying system’s learning algorithm and about patterns in the training data, so as to subsequently mount a poisoning or evasion attack.

The majority of attacks, including those mentioned above, that take place in the training phase are carried out by directly altering the dataset in order to learn, influence, or corrupt the model. Based on the adversary’s capabilities, training-time attack tactics fall into the following categories:

  1. Data injection: The adversary has no access to the training data or the learning algorithm, but has the capacity to add new data to the training set. By injecting adversarial samples into the training dataset, they can distort the target model.
  2. Data manipulation: The adversary has full access to the training data but no access to the learning algorithm. They poison the training data directly by altering it before it is used to train the target model (a minimal label-flipping sketch follows this list).
  3. Logic corruption: The adversary has the ability to tamper with the learning algorithm itself. Devising a counter-strategy against attackers who can change the logic of the learning algorithm, and thereby manipulate the model, is extremely difficult.
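As a rough illustration of the data manipulation scenario, the sketch below flips a fraction of the labels in a toy training set before training; the data set, flip rate, and label encoding are invented for the example.

#Python
import numpy as np

rng = np.random.default_rng(0)

# Toy binary training set; the features are irrelevant to the attack itself.
X_train = rng.normal(size=(1000, 20))
y_train = rng.integers(0, 2, size=1000)

def flip_labels(y, flip_rate=0.1):
    # Poison the training set by inverting a fraction of its binary labels.
    y_poisoned = y.copy()
    n_flip = int(flip_rate * len(y))
    idx = rng.choice(len(y), size=n_flip, replace=False)  # samples to corrupt
    y_poisoned[idx] = 1 - y_poisoned[idx]
    return y_poisoned

y_poisoned = flip_labels(y_train)
print((y_poisoned != y_train).sum(), "labels flipped out of", len(y_train))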

During testing, adversarial attacks do not tamper with the targeted model itself, but rather push it to produce inaccurate results. The amount of knowledge about the model available to the adversary determines the effectiveness of an attack, and these attacks are classified as either white-box or black-box attacks.

White-Box Attacks

In a white-box attack on a machine learning model, the adversary has complete knowledge of the model used (for example, the type of neural network and the number of layers). The attacker knows which algorithm was used in training (for example, gradient descent optimization), has access to the training data distribution, and knows the parameters of the fully trained model. The adversary uses this information to analyze the feature space where the model may be vulnerable, i.e., where it has a high error rate. The model is then exploited by modifying an input using an adversarial example creation method such as FGSM. A white-box attack with access to the internal model weights corresponds to a very strong adversarial attack.

Black-Box Attacks

A black-box attack assumes no prior knowledge of the model and exploits it using information about its settings and prior inputs. In an oracle attack, for example, the adversary investigates the model by supplying a series of carefully constructed inputs and observing the outputs.
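A common black-box pattern described in the literature is the oracle-plus-substitute approach: query the target model, label a synthetic data set with its answers, and train a local substitute to attack in its place. The sketch below illustrates that flow; the oracle function, query budget, and choice of a decision tree as the substitute are all assumptions made for the example.

#Python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

def oracle(X):
    # Stand-in for the remote black-box model: we only ever see its predictions.
    return (X.sum(axis=1) > 0).astype(int)

# 1. Query the black box on synthetic inputs (a limited query budget).
X_query = rng.normal(size=(500, 10))
y_query = oracle(X_query)

# 2. Train a local substitute model on the collected input/output pairs.
substitute = DecisionTreeClassifier(max_depth=5).fit(X_query, y_query)

# 3. Attacks can now be crafted against the substitute (white-box) in the hope they transfer.
print("agreement with oracle:", (substitute.predict(X_query) == y_query).mean())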

Adversarial learning poses a serious danger to machine learning applications in the real world. Although there are some countermeasures, none of them is a one-size-fits-all solution, and the machine learning community still hasn’t come up with a sufficiently robust design to counter these adversarial attacks.

References:   

A survey on adversarial attacks and defences
