The Key to Business Survival in the Era of AI

Build a Continuously Adaptive Learning Organization (CALO)

Among the most perplexing challenges facing CEOs in recent years is how best to compete in a data-driven economy that is becoming increasingly dominated by large and rapidly growing companies, often described as “superstar” firms. Although the most valuable superstar firms are in the technology industry, the group includes hundreds of corporations in other industries that have grown rapidly during globalization.

Over the course of several dozen discussions with superstar firms and their competitors, unique characteristics in each company have become apparent, as have commonalities. The combination of vast customer bases, increasingly intelligent enterprises, large-scale supply chains, top-tier teams that include the majority of leading AI scientists, and nearly unlimited capital makes superstar firms formidable opponents. It is unsurprising, then, that business owners and managers in the path of superstar firms express concerns about a daunting future. One common growth strategy among superstar firms is to leverage their strengths to enter new industries, which creates considerable risk for companies of all sizes.

Opportunities for competitors of superstars exist nonetheless. In order to continue growing at the same pace over the next decade, the fastest-growing superstars would need to expand rapidly into other large industries, such as finance and education, which are highly regulated, or healthcare, which is a government monopoly in much of the world. Amazon’s revenue, for example, grew by over $105 billion in 2020. Revenue growth for the group of superstars we track exceeded $1 trillion in 2020.

The rapid growth and market power of superstars cause complex challenges for society and governments, leading to increased antitrust enforcement, fines, and regulation. These reactions by markets and regulators are the primary reason why superstars in one business cycle are knocked out of the top decile by the next, a pattern confirmed by McKinsey & Company’s research on superstar firms. Given these headwinds, it will become increasingly difficult for superstar firms to maintain a similar growth rate moving forward, providing an opportunity for competitors. In addition, tens of billions of dollars of investment in AI research and development over the last decade have resulted in sharply lower adoption costs and much higher performance levels, opening the option for most organizations to adopt advanced AI systems.

Evolution from a Learning Organization to a CALO

In 1990, Peter Senge and his team argued that if leaders could build learning organizations that shared knowledge, collectively learned, and adapted to changing conditions, the organization would significantly increase its probability of success. Today’s superstar firms are modern versions of learning organizations that aggressively apply this concept across their networked enterprises. Given the volume of internal and external information in the network economy, the ability to learn, adapt, and execute requires sophisticated data management systems. We describe modern versions of these organizations as Continuously Adaptive Learning Organizations, or CALOs.

One example of a CALO is Amazon and its AWS division. The world’s largest ecommerce retailer learned valuable lessons from its online operations, leading to the creation of AWS, which provided Amazon with a superior business model that delivered about 90 percent of the company’s cash flow in 2020. Another example is NVIDIA, which invented the graphics processing unit (GPU) in 1999. In 2012, AlexNet won the ImageNet competition, popularizing the use of GPUs to train convolutional neural networks. NVIDIA has since adapted by seizing the opportunity in machine learning, overtaking Intel as the most valuable semiconductor company.

In the parcel delivery business, UPS learned from decades of operations research, resulting in a route-optimization program, ORION, that saves the company hundreds of millions of dollars annually while cutting greenhouse gas emissions by 100,000 metric tons every year.

Walmart learned from fierce competition with Amazon and other e-commerce leaders that it needed a new operating system (OS). Walmart reacted with a bold decision to invest $3.3 billion to acquire Jet.com, providing a key ecommerce component of Walmart’s new type of OS. Four years later, ecommerce sales at Walmart surpassed $40 billion annually, and the company produced a $5 billion quarterly profit.

Although each of these companies represents a different type of CALO, they all demonstrated the ability to learn and adapt to highly complex, rapidly changing conditions in the network economy. These companies made significant investments in technology, adapted within the competitive time window, and seized opportunities. Management in CALOs must master systems thinking, the fifth discipline in Peter Senge’s book of the same name. Achieving a CALO today requires management to view their organizations as complex adaptive systems, increasingly enhanced with AI.

While no organization has achieved a perfect CALO, some are clearly much closer than others. If it were possible to access sufficient data to create a CALO index across the public and private sectors, our engagements suggest that the highest performing superstars would achieve somewhere around 80% of a perfect CALO, and the lowest performing organizations would be about 10%.

Some organizations lack external incentives to learn and adapt, while others face hypercompetitive environments that require a functional CALO to survive. Superstars enjoy a combination of scale, talent, and technology, all unified within a networked enterprise and interconnected with their global ecosystem. Organizations lacking such a competitive enterprise should assemble its components, scale, talent, and technology, through partnerships.

New innovations and business models can also be effective. Costco, for example, is among the most successful superstar firms in the retail industry. The company manages a large global supply chain and sophisticated interconnected enterprise, yet isn’t known for technology or ecommerce. To date Costco has performed well even as its competitor Amazon has grown much larger. Tesla provides an interesting case of a new company that pioneered new products in electric vehicles to become a superstar, despite entrenched competition and an exceptionally high cost of entry.

These and many other successful companies owe their success in part to achieving a functional CALO. Given the entrance of enterprise-wide AI systems with the capacity to learn quickly, the bar is rapidly rising. The majority of organizations, including superstar firms, will therefore need to substantially increase the functionality of their CALOs as AI-enhanced competitors rapidly learn and adapt.

Powering the CALO with AI Systems

Although low-cost commoditized computing is necessary, ubiquity carries a very high cost: it provides no competitive advantage. This IT commoditization paradox favors superstar firms with sufficient scale, wealth, and talent to compete in advanced technology. Superstar firms typically invest heavily in custom systems and applications to remain competitive. An alternative approach is to adopt an enterprise-wide data management system pre-designed to optimize machine learning, which can provide some of the advantages of custom software without the high cost of custom AI systems. A precision data structure tailored to the needs of each entity can leverage and augment human capital in the organization, which is ultimately the most important competitive advantage, the one on which all others depend.

Principled Management for Enterprise AI

We recently disclosed our 15 enterprise AI management principles, providing a brief rationale and implications for each, which can serve as a management guide for enterprise AI (EAI) systems. In the AI era, the implications of planning and execution are profound. Even those organizations historically protected by market dominance or legislation are at risk. Whether businesses or nations, those who fall behind may not be in a position to honor their full obligations in the future.

Principle 1: EAI should have governance, ethics, and security built-in from inception.

Rationale: Good system design is paramount. Governance is the foundation AI systems are built upon.

Implications: Attempting to add on governance, ethics, and security after the fact is technically difficult, inefficient, and prone to error.

Principle 2: Design-in systemic data quality management. 

Rationale: AI systems are only as good as the data they train on.

Implications: Garbage-in / garbage-out; quality-in / quality-out.

Principle 3: Maintain strong security.

Rationale: EAI includes the most important human workflow in the enterprise, including strategy, planning, and intellectual property.

Implications: Compromised EAI systems can be devastating to the organization.

Principle 4: Embed integrity throughout AI systems.

Rationale: Enterprise-wide AI systems interact with the entire organization, providing a foundation for integrity.

Implications: If EAI systems lack integrity, so too will the organization.

Principle 5: Maintain objectivity.

Rationale: AI systems are an invitation for manipulation by political activists, fraudsters, and governments with totalitarian tendencies, resulting in misinformation that can lead to a dystopian future. Unless prevented, unconscious bias can also become embedded in AI through human workflow and algorithms.

Implications: Maintaining objectivity in EAI is essential. Cognitive bias is a significant risk in AI systems that has already been realized at scale.

Principle 6: Provide privacy, transparency and explainability.

Rationale: In order to earn and maintain trust, EAI systems should be transparent and explainable in a simple manner.

Implications: If employees, customers, and partners don’t trust the system, they are less likely to participate and more likely to attempt to work around it.

Principle 7: Empower individuals.

Rationale: Organizations of every size and type consist of individuals.

Implications: Organizations that lose sight of the needs of individuals tend to experience low morale, high turnover, and cultural deterioration.

Principle 8: Adopt enterprise-wide architecture.

Rationale: Value sometimes arises where least expected.

Implications: The one individual or sensor left out of EAI for the enterprise, supply chain, or ecosystem might be the one that would have recognized and raised the alarm about an existential risk.

Principle 9: Turbocharge prevention with EAI.

Rationale: The highest ROI possible is prevention of major crises.

Implications: Failure to prevent crises can lead to a negative spiral, failure, and collapse.

Principle 10: Accelerate discovery and productivity.

Rationale: Well-designed AI systems have a unique capacity to augment, enhance, and accelerate discovery, research, development, and execution.

Implications: It may not be possible to catch competitors that are more effective in applying EAI for productivity growth and R&D.

Principle 11: Establish and maintain a competitive advantage.

Rationale: AI systems learn quickly, innovation is accelerating, and the gap between AI leaders and laggards is rapidly expanding.

Implications: Organizations that fail to maintain competitive AI systems are at high risk of falling behind, disruption, and displacement.

Principle 12: Leverage EAI for continuous human learning and improved decision making.

Rationale: Rapid machine learning only becomes an advantage when it improves human behavior.

Implications: Failing to rapidly learn in the AI era risks extinction.

Principle 13: Mitigate risk in the development, auditing, and monitoring of EAI.

Rationale: How AI systems are designed, audited, and monitored, and by whom, can determine the outcome.

Implications: A significant percentage of large enterprise crises have been due to internal actors (“don’t try this at home”).

Principle 14: Incentivize accuracy and improvement with knowledge currency.

Rationale: Data intelligence within the system allows more accurate financial and psychological rewards with digital currency, which can serve to guide organizational and individual behavior.

Implications: Organizations that fail to recognize and reward top performers tend to lose them to competitors.

Principle 15: Continuously adapt.

Rationale: “The species that survives is the one that is able best to adapt and adjust to the changing environment in which it finds itself.” – widely attributed to Charles Darwin, though the line is a later paraphrase and does not appear in The Origin of Species.

Implications: The pace of change appears to be accelerating. Organizational cultures that master AI systems for continuous adaptation are most likely to thrive in the future.

Balancing Ethics with Existential Risk

Organizations are faced with difficult strategic and ethical choices in the era of AI. At one extreme, governments all over the world are currently faced with the decision of whether to allow their militaries to employ automated weapons at large scale. The rationale and implications for senior military officers are relatively simple; ethical considerations are not. Is it more ethical to embrace a ban on automated weapons, given the difficulties of enforcement and the risk to nations, or to authorize automated weapons and focus on precision governance?

Business leaders are faced with similar existential questions and must balance ethics with fiduciary responsibilities and perhaps even survival. Although most businesses are not yet faced with immediate extinction, strategic decisions on AI can have profound impacts. Machine learning is rapid and begins immediately upon adoption, providing an incentive to make bold investments early in the process of commercialization, yet some companies that jumped in early have wasted billions of dollars. Many other companies have experimented with machine learning projects rather than embarking on a bold journey in AI systems, only to learn that a systems approach is necessary to be competitive.

Regardless of the size or type of organization, leaders will need to balance ethics with performance and strategic imperatives in the AI era. Otherwise, their organizations may not survive long enough to make future positive impacts for shareholders, employees, customers, and society. In our many discussions with leaders about enterprise AI systems, a variety of strategies have been executed and many mistakes made, but the highest performers demonstrated the ability to quickly learn and adapt. Whether organizations have ambitions to become a superstar firm or must defend against one, the goal should be to first achieve a CALO enhanced with AI.

References

[1] “The Rise of Superstars,” The Economist, Special Report, September 17th, 2016. www.economist.com

[2] M. Bajgar, S. Calligaris, C. Criscuolo, L. Marcolin, and J. Timmis, “Superstars Are Running Away With The Global Economy,” Harvard Business Review, November 14, 2019. https://hbr.org

[3] D. Autor, D. Dorn, L. Katz, C. Patterson, and J. Van Reenen, “The Fall of the Labor Share and Rise of Superstar Firms,” Quarterly Journal of Economics 135, no. 2 (May 2020): 645–709.

[4] J. Manyika, S. Ramaswamy, J. Bughin, J. Woetzel, M. Birshan and Z. Nagpal, “‘Superstars’: The dynamics of firms, sectors, and cities leading the global economy,” McKinsey Global Institute, October 24, 2018, www.mckinsey.com.

[5] R. Heston and R. Zwetsoot, “Mapping U.S. Multinationals’ Global AI R&D Activity,” Center for Security and Emerging Technology, December, 2020. https://cset.georgetown.edu

[6] P. Senge, “The Leader’s New Work: Building Learning Organizations,” MIT Sloan Management Review, October 15, 1990. https://sloanreview.mit.edu

[7] S. Rosenbush, L. Stevens, “At UPS, the Algorithm is the Driver,” The Wall Street Journal, February 16, 2015. https://www.wsj.com

[8] M. Montgomery, KYield, “Why Every Company Needs a New Type of Operating System Enhanced with AI,” September 10, 2017. https://kyieldos.com

[9] P. Senge, The Fifth Discipline: The Art and Practice of the Learning Organization (New York: Doubleday Currency, 1990).

[10] J. S. Lansing, “Complex Adaptive Systems,” Annual Review of Anthropology 32 (October 2003): 183–204.

[11] J. Dastin, P. Dave, “U.S. commission cites ‘moral imperative’ to explore AI weapons,” Reuters, January 26, 2021. https://www.reuters.com

[12] T. Levin, “One of Uber’s earliest investors says the billions it spent on self-driving were a ‘waste of money’,” Business Insider, February 2, 2021. https://www.businessinsider.com

[13] B. Ghosh, P. R. Daugherty, H. J. Wilson, and A. Burden, “Taking a Systems Approach to Adopting AI,” Harvard Business Review, May 9, 2019. https://hbr.org

10 Top Mobile Application Development Trends You Can’t Miss

The years 2019 and 2020 saw unprecedented growth in mobile app development, and by our estimate it will remain the top area in 2021. However, are you aware of the latest trends? If not, you risk losing out to competitors who are already doubling down on incorporating these trends into their mobile apps. Every business or brand wants to roll out a mobile app and is hiring a mobile app development company to do it.

According to Statista, in 2021, over 140 billion mobile apps will be downloaded from both the App Store and Google Play store.

The data indicates the growing popularity and acceptance of mobile apps, which are a requirement for a business today. In this blog, we will talk about a few areas or trends that you, as a business, must keep in mind so that you do not miss out on growth. Ready or not, here we go!

#1. Internet of Things (IoT)

It is wonderful to have a digital assistant help you with mundane tasks. Growing mobile penetration and the desire to integrate technology with every aspect of our lives have led to the seamless integration and interconnectedness of devices with the internet. As per a recent report by GlobalData, 60% of the $220bn IoT market will come from software such as mobile apps.

Now you can switch on your air conditioner before you even reach home by accessing it from your mobile phone. The smart home is probably the best example: with a swipe of your finger, you can control or access your house.

#2. Mobile e-Commerce

Though it is not exactly the latest trend, we feel it will see lots of improvement this year, so we must discuss it in this blog. Today, over 72.9% of online purchases are made on mobile phones; no surprise that e-commerce giants, SMEs, luxury brands, and personal brands are trying to edge past each other to reach the market.
Therefore, if you are running an e-commerce business, hire a top mobile development company, because you may not fully understand the mobile app development process yourself.

#3. Cloud-Based Apps

Cloud services have revolutionized the development process for everyone. From massive savings to portability, you can harness the cloud’s potential to grow your business. The cloud offers virtual storage and access, which means you no longer have to download an app onto your mobile phone; you can access it from the cloud. An added advantage of cloud-based apps is that you only have to create one app for both the iOS and Android platforms, saving development costs.

#4. Smart Wearable

If you are in a business that sees potential in smart wearable technology, you had better hustle to integrate the key capabilities of your smart wearable app with mobile. Just consider the fitness industry. People want to keep an eye on their health data, steps taken, and so on. They wear bands or use mobile applications to measure and monitor their health. As per www.businesswire.com, the market for smart wearables will increase by almost 20% in the next five years.

#5. Enterprise Mobile Apps

More and more businesses are investing in creating customized mobile apps for themselves. An enterprise mobile app gives you access to analytics and data on the go. We are not suggesting that desktop applications will become passé; rather, a business can gain an advantage by using a mobile app alongside its desktop application to enhance work productivity and reduce costs.

#6 Decentralized Apps (DApps)

The blockchain platform is not ideal just for financial transactions. You can harness the salient features of the blockchain platform to get a customized application for your business. You get robust security on the blockchain platform, which keeps your data safe and secure, and its decentralized nature ensures complete transparency and tracking of all transactions.

#7 Machine Learning & Artificial Intelligence

Apple Inc. is leading the race to utilize machine learning and AI technologies in its products. We are sure you must have used ‘Siri’ or ‘Alexa’ to search for something or to ask a query. Rapid advancements in mobile app development will allow more nuanced and enhanced features to be incorporated into mobile apps. Camera-based apps with smart recognition features are already in use today.

#8 On-demand Apps

If you have ever ordered a ride on Uber or a snack on Zomato, you have already used an On-demand mobile app. In the future, you will see a whole bouquet of services that are ordered using their custom-built mobile application. Whether it’s yoga or a Salsa class, people are increasingly getting comfortable with online training. So, if you are in the Services business, you must have a customized mobile-based application, in addition to the standard website.

#9 5G Technology

Going beyond 4G, mobile network companies have already rolled out 5G technology, which offers customers high-speed internet access, and mobile phone makers are offering 5G-compatible smartphones. The future of the internet is the 5G network, so mobile apps must be developed to be 5G compatible to offer users a seamless experience.

#10 Beacon Technology

Proximity marketing will become the norm in the future. Businesses will market their products to customers passing by their premises. The technology relies on the Bluetooth functionality of mobile phones to alert customers about special discounts or flash sales.

We feel that for any business to stay relevant and grow, it is imperative to be aware of future mobile app technology trends. We also understand that your business requires your constant attention, which is why we suggest you rely on the expertise of a mobile application development company to help you breach the frontier of growth for your business. There are many mobile application development companies in India; we suggest you hire one that has over a decade of experience working with international clients. Remember to ask how many projects they have successfully delivered in the past five years and about their industry footprint.

We hope this read provided you with value. We constantly strive to inform you about the latest in app development technology and process. Your feedback is our fuel, so do leave your comments about this blog, and we would appreciate it if you shared it with someone who cares about technology as much as you and we do.


PLM of the future (look no further, it is here!)

The COVID-19 pandemic has created a massive shift in the way we work. With social distancing and frequent lockdowns to contend with, a large portion of the global workforce is operating from home. In the US, 58 percent of knowledge workers are working remotely,[i] a clear reflection of the distributed nature of work that will dominate the future. Fortunately, most enterprise and other systems today can go virtual. Distance doesn’t matter. The Tour de Suisse, a bicycle race, has gone virtual with shades of Mixed Reality (MR),[ii] and if that doesn’t surprise you, heavy-duty mining equipment in Australia has been operated by experts sitting 1,500 km away, in the safety of their offices, for a few years now.[iii] The mining teams can just as easily do this sitting at a desk at home. If such a vast variety of roles and operations can go remote or virtual, why should Product Lifecycle Management (PLM) be any different?

PLM has already helped organizations cut down costs and has enabled faster creation of innovative products. Now, as COVID-19 wreaks havoc across the globe, PLM can rise to the rescue of organizations operating in industries such as Manufacturing, Consumer Packaged Goods (CPG), Retail, Life Sciences, etc. PLM applications are more critical today than ever before because they can help businesses rapidly adjust their production lines to meet changing customer needs.

The problem is that PLM has its roots in manufacturing, with a closed set of users. The challenge is twofold. First, PLM must be extended beyond the engineering team and made available for collaboration with other functions like manufacturing, quality, procurement, services, sales and marketing, and leadership. Second, PLM was designed to manage the product lifecycle by integrating with upstream CAD and downstream ERP systems; today, however, it needs to integrate with new technologies like mobility, advanced analytics, the Internet of Things (IoT), automation/RPA, Augmented Reality (AR), Virtual Reality (VR), and additive manufacturing.

Industries where PLM implementation is mature can easily make the necessary changes. This means industries like automotive and transportation, where 23.8% have adopted PLM, or industrial equipment, where 20.2% have adopted PLM, can be first movers and lead the way.[iv] Forecasts suggest that industries such as life sciences, consumer goods, and retail will increase adoption, with PLM growing at the highest CAGR of around 8% across these industries.[v] These industries too should adjust their PLM strategies to become future ready. With technology partners who are well versed in PLM implementations, this is not difficult to achieve.

A well-known business specializing in performance running shoes and athletic wear, based in Seattle, offers a good example of how even mid-sized companies can update their PLM quickly so they can continue to respect the health guidelines and social distancing norms imposed by COVID-19. The organization’s complex processes require designers to work with new materials and teams to test these designs before they go to production. To do this, the organization has incorporated 3D design and modelling into its PLM processes. Its designers use high-resolution images of materials to create new products, cutting down on the need for physical samples.[vi] A real-time data platform and modern UIs, integrated with PLM, are at the core of better collaboration between the teams that manage schedules, plans, sourcing, design, testing, and design approvals for production. Step by step, the company has overcome the limitations imposed by COVID-19. It even has role-based applications to extend the ability of its PLM. These apps cater to the needs of individual users, improving collaboration, productivity, and agility (see the list below).

Improving productivity, efficiency and business agility by extending PLM with role-based applications:

• Mobility Apps: Provide the flexibility to work from anywhere, on any device. Users can capture images and share them across functions.
• Automation Apps: Replace repetitive tasks with RPA to provide 20 to 25% faster resolution of service requests.
• Process Apps: Create end-to-end business processes to improve turnaround time.
• Reporting Apps: Provide end-to-end visibility and facilitate faster, more accurate decision-making by leveraging analytics.
• Tracker Apps: Enable status updates, help identify bottlenecks, and reduce delays by 15 to 20%.

The strategy is easily replicable. The bigger challenge before organizations is to ensure that while these changes are being made and the ability of PLM is extended, it keeps the future firmly as a guidepost. The investments being made today must stay relevant to the business needs two to three years into the future. Businesses that do this will thrive in the long-term.

[i]    https://www.forbes.com/sites/johnkoetsier/2020/03/20/58-of-american-knowledge-workers-are-now-working-remotely/#3034dc3b3303

[ii]  https://www.youtube.com/watch?v=cmj5lZkIUbI

[iii] https://www.zdnet.com/article/rio-tinto-preparing-for-the-mine-of-the-future-with-automation/

[iv] https://www.pdsvision.com/wp-content/uploads/2019/05/Quadrant-Market-Outlook.pdf

[v] https://www.pdsvision.com/wp-content/uploads/2019/05/Quadrant-Market-Outlook.pdf

[vi] https://youtu.be/s8tbAPYd1Z4

About the author: Akhil Jain is VP, PLM, at ITC Infotech. He is responsible for managing the PLM practice and delivery for Manufacturing and CPG globally and comes with strong technology consulting and implementation experience in PLM packages focused on Manufacturing, Retail and CPG. 

 

What I Learned From 25 Years of Machine Learning


Here is what I learned from practicing machine learning in business settings for over two decades, and prior to that in academia. Back in the nineties, it was known as computational statistics in some circles, and some problems, such as image analysis, were already popular. Of course, a lot of progress has been made since, thanks in part to the power of modern computers, the cloud, and large data sets now being ubiquitous. The trend has evolved towards more robust, model-free, data-driven techniques, sometimes designed as black boxes: for instance, deep neural networks. Text analysis (NLP) has also seen substantial progress. I hope that the advice I provide below will be helpful in your data science job.

11 pieces of advice

  • The biggest achievement in my career was to automate most of the data cleaning / data massaging / outlier detection and exploratory analysis, allowing me to focus on tasks that truly justified my salary. I had to write a few reusable scripts to take care of that, but it was well worth the effort.
  • Be friends with the IT department. In one company, much of my job consisted of producing and blending various reports for decision makers. I got it all automated (which required direct access via Perl code to sensitive databases) and I even told my boss about it. He said that I did not work a lot (compared to hard workers) but understood, and was happy to always receive the reports on time, automatically delivered to his mailbox, even when I was on vacation.
  • Leverage APIs. In one company, a big project consisted of creating and maintaining a list of the top 95% of keywords searched for on the web, and attaching a value / yield to each of them. The list had about one million keywords. I started by querying internal databases, then scraping the web, and developing yield models. There was a lot of NLP involved. Until I found out that I could get all that information from Google and Microsoft by accessing their APIs. It was not free, but not expensive either, and initially I used my own credit card to pay for the services, which saved me a lot of time. Eventually my boss adopted my idea, and the company reimbursed me for these paid API calls. They continued to use them, under my own personal accounts, long after I was gone.
  • Document your code, your models, and every core task you do, with enough detail and in such a way that other people understand your documentation. Without it, you might not even remember what a piece of your own code is doing 3 years down the road, and have to re-write it from scratch. Use simple English as much as possible. It is also good practice, as it will help you train your replacement when you leave.
  • When blending data from different sources, adjust the metrics accordingly for each data source; metrics are likely not to be fully compatible, or some may be missing, as things are probably measured in different ways depending on the source. Even over time, the same metric in the same database can evolve to the point of no longer being compatible with historical data. I actually have a patent that addresses this issue.
  • Be wary of job interviews for a supposedly wonderful data science job requiring a lot of creativity. I was misled quite a few times; the job eventually turned out to be a coding job. It can be a dead-end, boring job. I like doing the job of a software engineer, but only as long as it helps me automate and optimize my tasks.
  • Working remotely can have many rewards, especially financial ones. Sometimes it also means less time spent in corporate meetings. I had to travel every single week between Seattle and San Francisco, for years. I did not like it, but I saved a lot of money (not least because there is no state income tax in Washington, and real estate is much less expensive). Also, walking from your hotel to your workplace is less painful than commuting, and it saves a lot of time. Nowadays telecommuting makes it even easier.
  • Embrace simple models. Use synthetic or simulated data to test them (see the sketch after this list). For instance, I implemented various statistical tests, and used artificial data (many times from number theory experiments) to fine-tune and assess the validity of my tests / models on datasets for which the exact answer is known. It was a win-win: working on a topic I love (experimental and probabilistic number theory) and at the same time producing good models and algorithms with applications to real business processes.
  • Being a generalist rather than a specialist offers more career opportunities, within your company (horizontal move) or anywhere. You still need to be an expert in at least one or two areas. As a generalist, it will be easier for you to become a consultant or start your own company, should you decide to go that route. Also, it may help you understand the real problems that decision makers are facing in your company, and have a better, closer relationship with them. Or with any department (sales, finance, marketing, IT).
  • In data we trust. I disagree with that statement. I remember a job at Wells Fargo where I was analyzing user sessions of corporate clients doing online transactions. The sessions were extremely short. I decided to have my boss do a simulated session with multiple transactions, and analyze it the next day. It turned out that the session was broken down into multiple sessions, as the tracking service (powered by Tealeaf back then) started a new session any time an HTTP request by the same user came from a different server (that is, pretty much for every user request). The Tealeaf issue was fixed once Wells Fargo reported it, and I am sure this was my most valuable contribution at the bank. In a different company, reports from a third party were totally erroneous, missing most page views in their count: it turned out that their software was cutting every URL that contained a comma, a glitch caused by bad programming by some software engineer at that third-party company, combined with the fact that 95% of our URLs contained commas. If you miss those massive glitches (even though in some ways it is not your job to detect them), your analyses will be totally worthless. One way to detect these glitches is to rely on more than one single data source.
  • Get very precise definitions of the metrics you are dealing with. The fact that there is so much fake news nowadays is probably because the concept of fake news has never been properly defined, rather than a data / modeling issue.
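
One of my favorite sanity checks, tied to the advice above on testing models with synthetic data: simulate the null case many times and verify that a test rejects at roughly its nominal rate. A minimal sketch in Python (numpy and scipy assumed; all numbers are made up):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Null case: both samples come from the SAME distribution, so a correct
# two-sample t-test should reject at roughly the nominal 5% rate.
trials, false_positives = 2000, 0
for _ in range(trials):
    a = rng.normal(loc=0.0, scale=1.0, size=50)
    b = rng.normal(loc=0.0, scale=1.0, size=50)
    _, p = stats.ttest_ind(a, b)
    if p < 0.05:
        false_positives += 1

print(f"observed false-positive rate: {false_positives / trials:.3f}")  # ~0.05
```

If the observed rate is far from 5%, the test implementation (or its assumptions) is broken, and you find out on data where the truth is known rather than in production.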


About the author: Vincent Granville is a data science pioneer, mathematician, book author (Wiley), patent owner, former post-doc at Cambridge University, and former VC-funded executive, with 20+ years of corporate experience including CNET, NBC, Visa, Wells Fargo, Microsoft, and eBay. Vincent is also a self-publisher at DataShaping.com, and founded or co-founded a few start-ups, including one with a successful exit (Data Science Central, acquired by Tech Target). He recently opened Paris Restaurant in Anacortes. You can access Vincent’s articles and books here.

Data Warehouse, Data Lake, Data Mart, Data Hub: A Definition of Terms

In today’s business environment, most organizations are overwhelmed with data and looking for a way to tame the overload, make it manageable, and help team members gather and analyze data to get the most out of the information contained within the walls of the enterprise. When a business enters the domain of data management, it can get lost in a morass of terms and concepts and find it nearly impossible to sort through the confusion. Without a clear understanding of the various categories and iterations of data management options, the business may make the wrong choice or become so mired in the review process that it gives up its quest.

 

This article is the first of two on the topic of data management. Here, we define the various terms so that a business can more easily understand the types of data management solutions and tools. In the second of these two articles, entitled ‘Factors and Considerations Involved in Choosing a Data Management Solution’, we discuss the various factors and considerations that a business should weigh when it is ready to choose a data management solution.

Data Warehouse

A Data Warehouse (AKA Datawarehouse, DWH, Enterprise Data Warehouse or EDW) solution is designed to centralize and consolidate large bodies of data from disparate, multiple sources and is meant to help users execute queries, perform analytics, provide reporting, and obtain business intelligence. Data Warehouse data typically comprises data from applications, log files, and historical transactions, and integrates and stores data from relational databases and other data sources originating in various business units and operational entities within the enterprise, e.g., sales, marketing, HR, and finance.

A Data Warehouse is a structured environment that is comprised of one or more databases and organized in tiers. An interactive, front-end tier provides search results for reporting, analytics and data mining. The search engine accesses and analyzes the data for presentation and the foundational architecture or database server provides the storage and loading repository.

A Data Warehouse environment will typically utilize an Extraction, Transformation and Loading (ETL) process to prepare data for analysis. Team members who access a Data Warehouse may use SQL queries, analytical solutions, or BI tools to mine, report, visualize, analyze, and present the data.
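
To make ETL concrete, here is a minimal sketch in Python with pandas and SQLite (the file, table, and column names are hypothetical, not from any particular product):

```python
import sqlite3

import pandas as pd

# Extract: pull raw sales records from a source file (hypothetical path)
raw = pd.read_csv("sales_export.csv")

# Transform: fix types, drop unusable rows, and aggregate to the grain
# the warehouse expects (daily revenue per region)
raw["day"] = pd.to_datetime(raw["order_date"]).dt.date
clean = raw.dropna(subset=["region", "revenue"])
daily = clean.groupby(["day", "region"], as_index=False)["revenue"].sum()

# Load: append the prepared rows into the warehouse table
with sqlite3.connect("warehouse.db") as conn:
    daily.to_sql("daily_revenue", conn, if_exists="append", index=False)
```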

 

Data Mart

We can think of a Data Mart as a subset of a Data Warehouse but, whereas a Data Warehouse is an enterprise-wide solution that comprises data from across the organization, the Data Mart is a structured environment that is used to store and present data for a specific team or business unit. This approach allows a business team or unit to curate, leverage and manipulate data that is specific to their teams. For example, a business might create a Data Mart to serve its Marketing, Sales and Advertising teams or it might expand that use to include Customer Service and Product teams so that it can more easily analyze and collaborate using data culled from specific sources within these business units.

 

While Data Warehouses access and analyze large volumes of records, a Data Mart improves the response time and performance for end-users by refining the data to provide only data that will support the collective needs of a specified group of users.

 

Think of a Data Mart as a ‘subject’ or ‘concept’ oriented data repository. A Data Mart often provides a subset of data from a larger Data Warehouse and is designed for ease of consumption, to produce actionable insight and analysis for a particular group.
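
One lightweight way to carve a mart out of a warehouse is a curated view. A minimal sketch, continuing the hypothetical SQLite warehouse table from the previous section (names are illustrative):

```python
import sqlite3

with sqlite3.connect("warehouse.db") as conn:
    # A mart is a subject-oriented slice of the warehouse: only the rows
    # and columns one team needs, pre-filtered for faster, simpler queries
    conn.execute("""
        CREATE VIEW IF NOT EXISTS marketing_mart AS
        SELECT day, region, revenue
        FROM daily_revenue
        WHERE region IN ('NA', 'EMEA')
    """)
```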

 

Data Lake

A Data Lake is a less structured and more flexible approach to data management with data streaming in from various sources and a more free-wheeling approach to data access, exploration and sampling. A Data Lake stores data with no organization or hierarchy. All data types are stored in raw form or semi-transformed format and data is only organized for presentation and use as queries or requests are generated.

 

A Data Lake can store structured (relational databases, rows and columns), semi-structured (XML, JSON, logs, CSV) and unstructured or binary (Word documents, PDFs, images, email, audio or video) data. It acts as a repository for various data sources, and users can apply that data to various types of analytics, from visualization and dashboard presentation to machine learning and data processing.
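
This organize-only-at-query-time pattern is often called schema-on-read. A minimal sketch (pandas assumed; file and column names are hypothetical):

```python
import pandas as pd

# Raw files land in the lake untransformed, in whatever format they arrive
events = pd.read_json("lake/raw/clickstream.json", lines=True)  # semi-structured
orders = pd.read_csv("lake/raw/orders.csv")                     # structured

# Structure is imposed only when a question is asked of the data
daily_clicks = events.groupby(pd.to_datetime(events["timestamp"]).dt.date).size()
print(daily_clicks.head())
```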

 

Data Hub

A Data Hub solution is typically a more flexible, personalized approach to data management with various integration technologies and solutions overlaid to provide the structure or output needed by the business. The data flows from various sources – not all of which will be operational. A Data Hub can provide data in various formats and perform actions to refine data for quality, security, duplicate removal, aged data, etc.

 

The Data Hub is meant to collect and connect data to produce insight for collaboration and data sharing. It will act as an integration and data processing hub to connect data sources and make them more readily accessible and usable for team members. The definition of a Data Hub will vary by business use and by organization as the parameters and organization of the hub environment will flex to the needs of the organization. So, factors like available models, data governance and access, data persistence and analytical formats and reporting options will vary.

 

As you consider the various solutions and options for data management, be sure to develop and use a comprehensive and detailed set of requirements and elicit feedback from those who will use and manage the solution.

 

Now that you understand the various Data Management options, you are ready to select an option for your business. The second of our two-article series, entitled, ‘Factors and Considerations Involved in Choosing a Data Management Solution’ will provide some simple suggestions and recommendations to help you choose the right option.

Your Guide for the Commonly Used Machine Learning Algorithms

We are living in a period in which computing has transformed immensely, from large mainframes to personal computers to the cloud. This constant technological progress and evolution in computing has resulted in major automation.

In this article, let’s understand a few commonly used machine learning algorithms. These can be used for almost any kind of data problem.

  1. Linear Regression
  2. Logistic Regression
  3. Decision Tree
  4. SVM algorithm
  5. Dimensionality Reduction Algorithms
  6. Gradient Boosting algorithms and AdaBoosting Algorithm
  • GBM
  • XGBoost
  • LightGBM
  • CatBoost
  1. Linear Regression

This is used to estimate real values, like the cost of houses, number of calls, or total sales, based on continuous variables. A relationship is established between the independent and dependent variables by fitting the best line. This best-fit line is called the regression line and is represented by the linear equation Y = a*X + b.

In this equation:

  • Y – Dependent Variable
  • a – Slope
  • X – Independent variable
  • b – Intercept

The coefficients a and b are derived by minimizing the sum of squared distances between the data points and the regression line.
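
A minimal sketch of fitting that line by least squares (numpy assumed; the data is made up):

```python
import numpy as np

# Toy data: house size (independent) vs. price (dependent), made-up values
X = np.array([50, 80, 110, 140, 200], dtype=float)    # square meters
Y = np.array([150, 210, 290, 360, 480], dtype=float)  # price in $1000s

# np.polyfit minimizes the sum of squared residuals, as described above
a, b = np.polyfit(X, Y, deg=1)  # slope a and intercept b of Y = a*X + b

print(f"Y = {a:.2f}*X + {b:.2f}")
print("predicted price for 120 m^2:", a * 120 + b)
```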

  2. Logistic Regression

This is used to estimate discrete values (mainly binary values like 0/1, yes/no, true/false) from a given set of independent variables. In simple terms, it predicts the probability of an event occurring by fitting the data to a logit function, which is why it is also called logit regression.

The following can be tried in order to improve a logistic regression model (a minimal sketch of the base model follows this list):

  • including interaction terms
  • removing features
  • regularize techniques
  • using a non-linear model
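
A minimal sketch of the base model with scikit-learn (assumed installed; the data below is synthetic):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic binary problem: higher feature values are more likely to be 1
X = rng.normal(size=(200, 1))
y = (X[:, 0] + rng.normal(scale=0.5, size=200) > 0).astype(int)

model = LogisticRegression()  # L2-regularized by default (C=1.0)
model.fit(X, y)

# predict_proba returns P(y=0) and P(y=1) from the fitted logit function
print(model.predict_proba([[1.5]]))
```
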
  3. Decision Tree

This is widely used for classification problems and is considered among the most popular machine learning algorithms. It works for both continuous and categorical dependent variables, splitting the population on the most significant attributes/independent variables to create groups that are as distinct as possible.
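
A minimal sketch on the classic iris dataset (scikit-learn assumed; the depth limit is illustrative):

```python
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

# max_depth caps how many successive splits on significant attributes are made
clf = DecisionTreeClassifier(max_depth=3, random_state=0)
clf.fit(X, y)

print("training accuracy:", clf.score(X, y))
```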

  4. SVM algorithm

This algorithm is a classification method in which the raw data are plotted as points in n-dimensional space (where n is the number of features present), with the value of each feature becoming the value of a particular coordinate. This makes the data easier to classify. For instance, consider two features like the hair length and height of an individual: each person is plotted as a point in two-dimensional space, and the points closest to the separating boundary are called support vectors.
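
A minimal sketch with scikit-learn (assumed installed; the two synthetic blobs stand in for the two groups described above):

```python
from sklearn.datasets import make_blobs
from sklearn.svm import SVC

# Two clusters in 2-D space, e.g. (hair length, height) style features
X, y = make_blobs(n_samples=100, centers=2, random_state=7)

clf = SVC(kernel="linear")  # finds the separating hyperplane
clf.fit(X, y)

# The support vectors are the boundary points that define the margin
print("number of support vectors:", len(clf.support_vectors_))
```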

  5. Dimensionality Reduction Algorithms

Over the last few years, huge amounts of data have been captured at every possible stage and analyzed by many sectors. The raw data contains many features, and the major challenge is identifying the most significant variables and patterns. Dimensionality reduction techniques like PCA and factor analysis, along with feature selection methods like decision trees, help find the relevant details using signals such as the correlation matrix and the missing value ratio.
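
A minimal sketch of PCA with scikit-learn (assumed installed), compressing the 64 pixel features of the digits dataset down to 10 components:

```python
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA

X, _ = load_digits(return_X_y=True)  # 64 features per image

pca = PCA(n_components=10)
X_reduced = pca.fit_transform(X)

print("reduced shape:", X_reduced.shape)  # (1797, 10)
print("variance retained:", pca.explained_variance_ratio_.sum())
```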

  6. Gradient Boosting algorithms and AdaBoosting Algorithm

GBM – These are boosting algorithms that are widely used when huge loads of data have to be handled to make predictions with high accuracy. AdaBoost is an ensemble learning algorithm that combines the predictive power of various base estimators to improve robustness.

XGBoost – This has exceptionally high predictive power, which makes it a most suitable choice when accuracy matters, as it possesses both tree learning algorithms and linear models.

LightGBM – This is a very fast, highly efficient gradient boosting framework that uses tree-based learning algorithms. It is designed to be distributed and offers the following benefits:

  • Parallel and GPU for machine learning supported
  • Quicker training speed and better efficiency
  • Lower memory usage and enhanced accuracy
  • Capable of handling large-scale data

CatBoost – This is an open-sourced machine learning algorithm. It can easily integrate with deep learning frameworks such as Core ML and TensorFlow, and it can work with various data formats.
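
A minimal sketch using scikit-learn’s GradientBoostingClassifier (XGBoost, LightGBM, and CatBoost expose similar fit/predict interfaces; the hyperparameters here are illustrative):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, n_features=20, random_state=1)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=1)

# Each new tree is fit to the residual errors of the ensemble built so far
gbm = GradientBoostingClassifier(n_estimators=200, learning_rate=0.1)
gbm.fit(X_tr, y_tr)

print("test accuracy:", gbm.score(X_te, y_te))
```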

Anyone seeking a career in machine learning should understand these algorithms and keep deepening their knowledge of them.

Why Enterprise Data Planning Is Crucial for Faster Outcomes

Have you ever sat in a meeting where everyone has a different number for the same performance measure? This typically results in spending the next hour trying to reconcile the differences rather than making the important business decisions required. Upon further analysis, it is likely everyone will have the right number according to the system from which it was derived.

The differences can likely be attributed to inconsistent hierarchical master data across these systems. The problem has existed ever since organizations started implementing more than one business system, but today it is magnified by the many systems most organizations run and by the large number of changes today’s business environment generates. It is therefore essential for organizations to effectively manage hierarchical master data across multiple information systems. Organizations need to move beyond the mix of email, spreadsheets, and ad hoc systems that many currently rely on to execute this extremely important function. Numerous organizations are looking for enterprise software solutions to help them effectively manage these problems without relying on manual processes.

Why is Enterprise Data Planning Important? 

Data is usually shared across many enterprise systems. For example: John (Sales Representative) who works in California (Territory) sells 10,000 (Quantity) of a new widget (Product) to a customer (Customer) based in New York (Geography) for $50,000 (Total Sale) on December 15, 2017 (Date). Taken together, this information is about one transaction, but included in the transaction are individual elements of master data—Sales Representative, Territory, Quantity, Product, Customer, Geography, Total Sale and Date.
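
One way to picture the distinction (a hypothetical sketch; the field names are illustrative, not a real schema): the transaction record carries keys into shared master data entities, so a change to one master record ripples through every system that references it.

```python
from dataclasses import dataclass

# Master data: maintained once, referenced by every system
@dataclass
class Product:
    sku: str
    name: str

@dataclass
class Customer:
    customer_id: str
    geography: str

# Transactional data: points at master records instead of copying them
@dataclass
class Sale:
    rep: str              # Sales Representative
    territory: str
    product_sku: str      # -> Product.sku
    customer_id: str      # -> Customer.customer_id
    quantity: int
    total_sale: float
    date: str

widget = Product(sku="W-100", name="New widget")
buyer = Customer(customer_id="C-42", geography="New York")
sale = Sale("John", "California", widget.sku, buyer.customer_id,
            10_000, 50_000.0, "2017-12-15")
```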

Today’s Enterprise Data Planning Challenges

How do most enterprises manage enterprise data today? Remarkably for something so important, they do it through conversations, telephone calls, spreadsheets and e-mail. For example, if a departmental manager wants to add another cost center, or if management wants to move facilities from human resources to finance, the business decision must first be approved by all the relevant decision makers. This takes time. Once the change is approved, IT receives the request to make the change and ensure that it ripples through all of the enterprise’s transactional systems, data warehouses, business intelligence and enterprise performance management solutions. Because changes are made manually, often the end result is a lot of people making a lot of mistakes with a lot of mission critical data—mistakes that go undiscovered due to a lack of visibility or traceability in the process. This is compounded by the sheer number of changes that take place in enterprises today. We constantly cite the increasing rate of change in business which inevitably leads to increasing change in enterprise data.

Modern Enterprise Data Management 

World-class performers experience significant benefits from taking a modern, agile approach to enterprise data management across their entire business systems landscape. Key characteristics of this approach include: 

• Eliminating the need for a formal, upfront data governance program and initiative that requires burdensome commitments including executive sponsorship, agreement on terms and definitions, enterprise policies, and a host of other coordination costs between Business and IT to orchestrate people, time and resources across lines of businesses, divisions or geographies. 

• Taking an elastic approach to managing enterprise data that is evolutionary, iterative, incremental and flexible. One that does not force mastering to achieve desired outcomes, but is fit for purpose based upon desired scope: peer-to-peer within a small workgroup, application-to-application to support local alignment, or enterprise-wide to enable global mastering initiatives as desired based upon the aspirations, capabilities, and maturity of an organization at a point in time. 

• Facilitating easy-to-use, web-based, self-service experiences for streamlined application maintenance, collaborative change management, faster data sharing, and accelerated new application development.

 • Utilizing a request-driven approach to all change management and data hygiene activities in an easy-to-use, self-service experience that promotes timely, accurate changes across a spectrum of business users.

 • Employing a business-driven approach to snapshot historical versions, branch off production data sets to explore what-if scenarios, and merge approved plans into production in a timely manner to drive value among connected business applications. 

• Comparing alternate business perspectives within and across applications to understand differences, and rationalize on a fit-for-purpose basis. 

• Streamlining last mile integration with connected business applications, across public, private, and hybrid cloud environments. 

• Maintaining fully transparent activity trails that enable regulatory compliance and risk mitigation.

PromptCloud can help you aggregate data from the web using advanced scraping techniques. Enjoy 100% quality data at the frequency of your choice.

Data Science: A Good Career Choice

Data science has become a new buzzword in the world of technology. Data scientists and big data engineers hold the promise of high pay and excellent job growth. In order to explore this beautiful world of data science, it is essential to know:

  • What is data science?  
  • Is data science a good career option?
  1. What is data science?  

Data scientists investigate where information comes from, how data points fit together to tell a story, what the patterns stand for, and how they help explain business results.

On a daily basis, data science means creating statistical models for projections, econometrics, classification, clustering, simulations, and other objectives; predicting user behavior through pattern or trend identification; performing thorough data analysis; and conveying data insights through visualization and summarization.

  2. Is data science a good career option? 

Data science has experienced 650% growth in jobs since 2012. The field is experiencing a huge rise in demand and is a good career choice.

Data science can be considered an intellectually demanding area of study and work. Much time is required to clean data, import large datasets, build databases, and maintain dashboards. In order to stand tall as a data scientist, you need to enjoy quantitative work and be enthusiastic about enabling firms to make data-driven decisions.

As per LinkedIn, SQL is the skill most requested for data science jobs. In addition, Hadoop and Spark are rising in popularity. You will be required to learn a programming language like Python, SAS, or R. Also, you should revise your mathematical skills, with a focus on probability and statistics, multivariate calculus, and linear algebra. It would help to also learn data visualization tools, for example Tableau.

It is also advisable to learn to code carefully as a beginner, since even a single change in a parameter can change the results, and there is very little margin for mistakes. As you advance in your career as a data scientist, you may choose to specialize in machine learning algorithms, natural language processing, deep learning, or many other related areas with a basis in unstructured data and big data.

To be successful as a data scientist, you should also possess soft skills like storytelling, teamwork, interpersonal skills, and communication skills. These skills often cannot be grasped from textbooks but instead develop on the job, in collaboration with tech teams and stakeholders across product, business, and other functions.

Benefits of studying and practicing data science  

The several advantages of data science are given below:

  1. Data science is in high demand  

According to LinkedIn, it is the fastest-growing career, and 11.5 million data science jobs are projected to be created by 2026. This shows it is a high-demand job area.

In addition, only very few people have the skills that make up a data scientist, which makes data science a less saturated area compared to other information technology areas. Data science is an exceedingly abundant area of work with tons of opportunities, also due to the low supply of data scientists.

  2. Career and payment in data science  

If you pursue a career in data science, you will be eligible for highly paid positions. As per Glassdoor, data scientists earn on average USD 116,100 per annum, making it a very lucrative job.

  3. Data science is a versatile field  

There are a huge number of applications of data science. It is mainly used in the banking, healthcare, and e-commerce industries and in consultancy services. The applicability of data science is thus versatile and will allow you to work in several fields.

  4. Data science turns data into a better state  

Firms require skilled data scientists to process and analyze their data. Data scientists improve the quality of data as well as analyze it; they work on enriching data and making it more useful for their company.

This is also why data scientists are so crucial to a firm: they enable better business decisions. Companies depend on their expertise to offer better solutions to clients, which is why data scientists hold a special position in the organization.

  5. Freedom from boring tasks

Data scientists have enabled several industries to automate redundant tasks. Firms use historical data to train machines to perform repetitive activities, simplifying arduous jobs that people used to do by hand; a minimal sketch of the idea follows.
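
Here is that sketch — a tiny text classifier that learns from historical, already-routed support messages and then routes new ones automatically. The messages, labels, and categories are all invented for illustration:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Historical, already-routed messages stand in for the firm's past data.
messages = [
    "invoice is wrong", "refund not received", "charged twice",
    "app crashes on login", "password reset fails", "page will not load",
]
labels = ["billing", "billing", "billing", "tech", "tech", "tech"]

# Train once on the historical examples...
router = make_pipeline(TfidfVectorizer(), LogisticRegression())
router.fit(messages, labels)

# ...then the repetitive routing step runs automatically from here on.
print(router.predict(["I was charged twice for my order"]))  # -> ['billing']
```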

Conclusion   

You can earn online certification as a data engineer or data scientist from a highly recommended platform for this purpose: dasca.org. These globally valued certificates can put your career on a definite rise.

 

Unleashing the Business Value of Technology Part 4: Delivering Value

We are now ready to wrap up this four-part series on how technology vendors – especially data and analytics technology vendors (and which technology vendors are not involved in data and analytics nowadays?) – can help their customers "unleash the business value of their technology investments."

I wrote this four-part series because, in my journeys these past few months, both technology customers and technology vendors have bemoaned their frustrations with deriving business value from their technology investments. Senior IT leadership in particular has come under increased scrutiny to deliver on the promise of these investments.

In Part 1, I introduced the three stages of the "unleashing the business value" roadmap: 1) Connecting to Value, 2) Envisioning Value, and 3) Delivering Value (see Figure 1).

Figure 1: Unleashing the Value of Technology Roadmap

In Part 2, I provided some techniques that enable technology vendors to connect to “value”, but that is “value” as defined by the customer’s business initiatives, not “value” as defined by the technology vendor’s product or services capabilities.  I introduced two fundamental techniques for vendors who want to connect with their customers’ sources of value creation:

  • Step 1: Understanding how your customers make money.  This requires investing the time upfront to research a customer’s key business initiatives and their supporting business and operational use cases.
  • Step 2: Triaging your customer's business initiatives.  I introduced the Big Data Strategy Document to help technology vendors understand where and how data and analytics might support their customers' key business initiatives.

In Part 3, I reviewed some techniques that help customers "envision" the realm of what's possible with respect to how data and analytics can derive and drive new sources of customer, product, and operational value.

Now in Part 4, it’s time to bring home the bacon!

De-risk the Path Forward

Part 4 is where the rubber (and vendor commitment) hits the road.  It is the technology vendor's opportunity to build a co-creation relationship with their customers and help them unleash the value of their technology investments.

The 3-stage customer engagement model depicted in Figure 2 de-risks the customer's decision to move forward by putting the large majority of the onus for success on the technology vendor.  This approach gives the technology vendor the opportunity to prove that it can deliver on the business potential of the customer's technology investments.

Figure 2: De-risk the Customer Path Forward

Stage 1: Envisioning Workshop builds off our earlier work in “connecting to value” and “envisioning value” to assure the customer that the solution and outcomes being proposed are relevant and meaningful vis-à-vis their business objectives.  Stage 1 is typically a low-cost, 2 to 3-week collaborative engagement with the key business stakeholders to identify, validate, value, and prioritize the use cases where data and analytics can deliver material business value.

Stage 2: Proof of Value (POV), sometimes called the Minimum Viable Product (MVP), focuses on proving the business value of the prioritized use case identified in Stage 1. This 4 to 6-month co-creation engagement requires close collaboration between the vendor and the customer to quantify the realized business value while building confidence in the vendor's ability to deliver the solution.  While many different technology tools will likely be used in Stage 2, its primary focus is to prove the vendor's ability to deliver quantifiable, relevant, and actionable business and operational outcomes.

Stage 3 builds upon the business and operational learnings from Stage 1 and reuses the technology assets co-created in Stage 2, integrating those assets into the customer's operational and management systems.  Having proven in Stage 2 that the vendor can deliver on the business potential, scaling and governance become the key focus in Stage 3, as the underlying technology architecture and infrastructure facilitate the ongoing delivery of business outcomes across the customer's different use cases.

Establishing the Data Monetization Governance Board

Success breeds success, and after the successful execution of Stage 3, more and more business units will want the opportunity to unleash the business value of their technology investments to deliver meaningful, relevant, and actionable business and operational outcomes.  To properly manage the sudden demand – and to avoid the data silos and orphaned analytics that doom the long-term economic value of data and analytics (yea, I wrote the book "The Economics of Data, Analytics, and Digital Transformation," which talks all about that) – the customer will need help establishing a Data Monetization Governance Board.

The Data Monetization Governance Board champions and enforces data and analytics monetization best practices across the organization. The Board has both the responsibility and the authority (the carrot and the stick) to enforce the sharing, reuse, and continuous refinement of the organization's data and analytic assets. Otherwise, data monetization will continue to be a haphazard effort with disappointing results.

As I discuss in the blog "Digital Transformation Requires Redefining Role of Data Governance," the Data Monetization Governance Board must:

  • Evangelize a compelling vision to the business executives regarding the economic potential of data and analytic assets to power an organization’s digital transformation.
  • Educate senior executives, business stakeholders and strategic customers on how to “Think Like a Data Scientist” in identifying and envisioning where and how data and analytics can deliver quantifiable and relevant business value.
  • Apply Design Thinking and Value Engineering concepts in collaborating with business stakeholders to identify, validate, value and prioritize the organization’s high-value use cases that will drive the organization’s data and analytics development roadmap.
  • Charter the Data Science team to “engineer” reusable, continuously-learning and adapting analytic assets that support the organization’s high priority use cases.
  • Develop an analytics culture of AI/ML model-to-human collaboration that empowers teams at the point of customer engagement and operational execution.

If data is the new oil – the catalyst for economic growth in the 21st century – then the Data Monetization Governance Board may very well be the most important structure in the modern organization.

Unleashing the Business Value of Technology End Goal

Hopefully this four-blog series helps technology vendors unleash the business value of their customers' technology investments.  This outcomes-driven process starts by (1) connecting with the customer's sources of value creation, then (2) helping the customer "envision the possibilities" of leveraging data and analytics to drive business outcomes, and finally (3) providing an iterative delivery model that de-risks the customer's path forward (see Figure 3).

Figure 3: Unleashing the Business Value of Technology Roadmap

In the end, if the technology vendor can’t help customers unleash the business value of their technology investments, then they are in the wrong business (see Figure 4).

Figure 4: Unleashing the Business Value of Technology

It should be like printing money…for your customers…

The early-mover advantage in AI: leaders vs laggards vs aspirants

Are you a leader or a laggard or an aspirant in AI?

This is a subject close to my heart because I focus my teaching, research, and consulting towards the leader end of AI – where competitive advantages create exponential gains for companies and people.

There is a great article from McKinsey called "Tipping the Scale of AI: How leaders capture exponential returns," which makes this point eloquently.

Here are the key takeaways from this article that resonate with me.

If you want to be an AI leader, you should be paying attention to this.

Also, if you are working for a company, you should try to see whether it aspires to be a leader or an also-ran.

I think in a decade, just like we are seeing with the retail industry, many of these also-rans who do not invest in AI for competitive advantage will not exist.  

  • Where many companies tire of marginal gains from early AI efforts, the most successful recognize that the real breakthroughs in AI learning and scale come from persisting through the arduous phases.
  • Key lessons from AI leaders: fund aggressively when the conditions for success are in place; build density in domains; bring a rounded set of skills and invest in productivity; speed execution with iterative releases; win the front line.
  • Many organizations underestimate what it takes to sow true gains, be it selecting the right seeds, apportioning the right investment, or having a mindset willing to put up with the vagaries of the crop cycle.
  • But for those that persevere, the rewards can be huge.
  • McKinsey research finds that leading organizations that approach the AI journey in the right ways and stick with it through the tough patches generate three to four times higher returns from their investments.
  • These AI leaders get on a different performance trajectory from the outset because they understand that AI is about mastering the long haul.
  • They prepare for that journey by anticipating the types of things that will make it easier to navigate the ups and downs, such as feedback loops that allow data quality and user adoption to compound and AI investments to become self-boosting.
  • But only a small number of businesses (10 percent) have figured out how to make AI work in these ways. The rest remain mired in the low to middling stages of maturity, with laggards making up 60 percent of the population and aspirants 30 percent.
  • Top performers recognize that most of the impact comes from the last 20 percent of the journey.
  • Leaders get disproportionate impact from their AI investments.
  • The window of opportunity for underperformers is closing.
  • Rather than dabbling in lots of different areas, they build strength and density in one or two domains, then expand from there. That approach allows them to deepen their use and application of unstructured data, access more sophisticated use cases, and layer in the necessary operational underpinnings—the investment, talent, data management, production, and other techniques that allow AI-enabled practices to become embedded into everyday routines.
  • Moreover, as leaders build domain strength and reach a certain threshold in AI performance, their rate of learning and productivity increases, allowing them to progress through other domains faster and tackle problems of ever-increasing difficulty.
  • They recognize that scaling AI solutions to deal with increasingly sophisticated problems is hard, but necessary to capture value. Teaching a machine to identify human faces is one milestone, for instance, but getting the machine to recognize particular faces and only those faces is a far more complex undertaking (see the sketch after this list).
  • Once solved, companies gain compounding benefits quickly.
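
To make the face example concrete, here is a hedged sketch using the open-source face_recognition library; the image file names are placeholders, and the matching tolerance is left at its default:

```python
import face_recognition

# Milestone 1: detect that an image contains human faces at all.
image = face_recognition.load_image_file("group_photo.jpg")  # placeholder path
print("faces found:", len(face_recognition.face_locations(image)))

# Milestone 2 (far harder to get right): recognize one particular face.
known = face_recognition.face_encodings(
    face_recognition.load_image_file("known_person.jpg")  # placeholder path
)[0]
candidates = face_recognition.face_encodings(image)
hits = [face_recognition.compare_faces([known], enc)[0] for enc in candidates]
print("is the known person present:", any(hits))
```

Getting that second step to fire for the right person and only the right person – across lighting, angles, and look-alikes – is where the compounding difficulty, and the compounding value, lives.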

 

While some of this can be seen as consultants prodding companies to action, to many of us none of this is new. We have already seen how companies like Amazon are reaping exponential gains from their investment in technology and AI.

There is an early-mover advantage in AI, and companies that aspire to take a leadership position will gain exponential benefits in comparison to the laggards and the aspirants.

Image source: Pixabay, ninikvaratskhelia

 
