Analytics and Insights, Change Management, Corporate Culture, Data Science, machine intelligence

Things to consider when driving machine intelligence within a large organization

While modern-day A.I. or machine intelligence (MI) hype revolves around big “eureka” moments and broad-scale “disruption,” expecting these events to occur regularly is unrealistic. The reality is that working with AI on simple routine tasks will drive better decision making systemically as people become comfortable with the questions and strategies that are now possible. And hopefully, it will also build the foundation for more of those eureka(!) moments. Regardless, technological instruments allow workers to process information faster and understand it more precisely, enabling more complex and accurate strategies. In short, it’s now possible to do in hours what once took weeks. Below are a few things I’ve found helpful to think about when driving machine intelligence at a large organization, as well as what is possible.

  • Algorithm aversion: humans are willing to accept flawed human judgment, yet become very judgmental when a machine makes a mistake, even within the smallest margin of error. Decisions generated by simple algorithms are often more accurate than those made by experts, even when the experts have access to more information than the formulas use (see the sketch after this list). For further elaboration on making better predictions, the book Superforecasting is a must-read.
  • Silos! The value of keeping your data or information secret as a competitive edge does not outrun the value of the potential innovation and insights if that data is liberated within the broader organization. Where possible, build what I call diplomatic back channels, where teams or analysts can share data with each other.
  • Build a culture of capacity. “Managers are willing to spend 42 percent more on an outside competitor’s ideas. This is why consultants get hired,” says Leigh Thompson, a professor of management and organizations at the Kellogg School. “We bring in outside people to tell us something that we already know,” because it paradoxically means all the wannabe “winners” on the team can avoid losing face. It’s not a bad thing to seek external help, but if this is how most of your work gets done and where you go for new ideas, you’ll have problems. As a residual, the organization will fail to build strategic and technological muscle, and it’s likely to create a culture that puts generalists, not novel technical thinkers, in leadership roles. In turn, you end up with an environment where technology is fitted to legacy processes and thinking, not the other way around, as it needs to be if you want to stay relevant. Avoid the temptation to outsource everything because nothing seems to be going anywhere right away. That 100-page PowerPoint deck from your consultant will only help in the most superficial of ways if you don’t have the infrastructure to drive the suggested outputs.
  • Organizations need to evolve their processes in step with technology, not fit new technologies to old or outdated processes. Otherwise, money is left on the table that isn’t even considered or known due to a lack of domain depth, which creates hierarchical bottlenecks that inhibit innovation and merit-based thinking. As a result, the talented people in your organization will defect, leaving mediocre skill sets to run general operations that standardize lowest-common-denominator outputs, and failing to maximize the prior investment in capabilities or assets that might have offered a competitive edge had they been seen through to sufficiency.
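
To make the first bullet concrete, here is a minimal sketch of the classic “improper linear model” result: an equal-weights combination of standardized cues compared against a noisy, overconfident “expert.” All data below is synthetic and purely illustrative.

```python
import numpy as np

# Toy illustration of an "improper linear model": equal-weight the
# standardized cues instead of letting an expert weigh them intuitively.
# All data here is synthetic, purely for illustration.
rng = np.random.default_rng(42)

n = 200
cues = rng.normal(size=(n, 3))                 # three predictive cues
outcome = cues @ np.array([0.5, 0.3, 0.2]) + rng.normal(scale=0.8, size=n)

# "Expert": heavily overweights one salient cue and adds noisy intuition.
expert_pred = 1.2 * cues[:, 0] + rng.normal(scale=1.0, size=n)

# Simple algorithm: just average the standardized cues with equal weights.
z = (cues - cues.mean(axis=0)) / cues.std(axis=0)
model_pred = z.mean(axis=1)

def corr(a, b):
    return np.corrcoef(a, b)[0, 1]

print(f"expert vs outcome: r = {corr(expert_pred, outcome):.2f}")
print(f"equal-weights model vs outcome: r = {corr(model_pred, outcome):.2f}")
```

The point of the toy: the dumb, consistent rule tends to track the outcome better than the inconsistent “expert,” which is the pattern the judgment literature keeps finding.
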
Analytics and Insights, Brussels, Corporate Culture, Data Science, European Parliament, European Union, Global Politics, Insights, Linguistics, Lobbying, machine intelligence, Politics, Public Affairs and Communications, U.S. Politics

OSINT One. Experts Zero.

Our traditional institutions, leaders, and experts have shown themselves incapable of understanding and accounting for the multi-dimensionality and connectivity of systems and events: the rise of far-right parties in Europe, the disillusionment with European Parliament elections as evidenced by voter turnout in 2009 and 2014 (despite more money spent than ever), Brexit, and now the election of Donald Trump as president of the United States. In short, there is little reason to trust experts without multiple data streams to contextualize and back up their hypotheses.

How could experts get it wrong? Frankly, it’s time to shift out of the conventional ways we try to make sense of events in the political, market, and business domains. The first variable is reimagining information from a cognitive-linguistic standpoint, probably the most neglected area in all of business and politics, at least within the mainstream. The basic idea? Words have meaning. Meaning generates beliefs. And beliefs create outcomes, which in turn can be quantified. The explosion of mass media, followed by identity-driven media, social media, and alternative media, has left us at the mercy of media systems that frame our reality. If you doubt this, reference the charts below. Google Trends is deadly accurate in illustrating what is on people’s minds: whoever dominates attention, bad or good, wins, at least when it comes to U.S. presidential elections. The saying “bad press is good press” is quantified here, as is George Lakoff’s thinking on framing and repetition (Google search trends can be used to easily see which frame is winning, by the way).
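
For anyone who wants to reproduce charts like the ones below, here is a minimal sketch using pytrends, an unofficial Python client for Google Trends; the keyword list and timeframe are illustrative choices, not the exact queries behind my charts.

```python
# Sketch: pull relative search interest for the 2016 candidates using
# pytrends, an unofficial Google Trends client (pip install pytrends).
from pytrends.request import TrendReq

pytrends = TrendReq(hl="en-US", tz=360)
pytrends.build_payload(
    kw_list=["Donald Trump", "Hillary Clinton"],
    timeframe="2016-01-01 2016-11-08",   # illustrative window
)

interest = pytrends.interest_over_time()  # weekly relative volume, 0-100
print(interest[["Donald Trump", "Hillary Clinton"]].tail())

# Crude "who is winning the attention frame" check: compare the means.
print(interest[["Donald Trump", "Hillary Clinton"]].mean())
```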

[Chart: Google search trends for presidential candidates, 2004 to present]

Google search trends of Democratic and Republican presidential candidates from 2004 to 2016. The candidate with the higher search volume won in every race.

Social media and news mentions of the key 2016 presidential candidates per hour. The query used was “Donald Trump” OR “Hillary Clinton” OR the hashtags #ElectionDay OR #Election2016.

[Chart: Google search trends for Nicolas Sarkozy, François Fillon, and Alain Juppé]

Google Trends wasn’t caught off guard by François Fillon’s win over Nicolas Sarkozy and Alain Juppé in the French Republican primary. Fillon was closer than the polls suggested all along.

Within this system, there is little reason to challenge one’s beliefs and almost nothing forcing anyone to question their own. Institutions and old media systems used to be able to bottleneck this: they were the only ones with a soapbox, and information moved reasonably slowly. Outthinking current systems requires a combination of sharper thinking, the ability to quantify unorthodox data such as open source intelligence (OSINT), and a creativity that traditional systems of measurement and strategy lack. Business, markets, and people strive, to a fault, for simple, linear, and binary solutions or answers. Unfortunately, complex systems, i.e. the world we live in, don’t dashboard into nice simple charts like the one below. The root causes of issues go ignored, untested, and uncontextualized, which creates only a superficial understanding of what affects business initiatives.

[Image: a simple linear BI dashboard chart]

A nice linear BI chart above. Unfortunately, the world is much more complex and connected, like the network graph below, which clusters together news media that covered Hillary Clinton and Donald Trump.

[Network graph of news media covering Donald Trump and Hillary Clinton]
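
A rough sketch of how a graph like the one above can be built and clustered, assuming a simple “covered the same stories” edge rule and networkx’s greedy modularity communities; the outlets and edges here are invented.

```python
import networkx as nx

# Toy network of news outlets; an edge means two outlets frequently
# covered the same Trump/Clinton stories. All names and edges invented.
edges = [
    ("Outlet A", "Outlet B"), ("Outlet A", "Outlet C"),
    ("Outlet B", "Outlet C"), ("Outlet D", "Outlet E"),
    ("Outlet E", "Outlet F"), ("Outlet D", "Outlet F"),
    ("Outlet C", "Outlet D"),  # weak bridge between the two clusters
]
G = nx.Graph(edges)

# Greedy modularity maximization splits the graph into communities.
communities = nx.algorithms.community.greedy_modularity_communities(G)
for i, c in enumerate(communities):
    print(f"cluster {i}: {sorted(c)}")
```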

I know this may feel like a reach in terms of how everything mentioned here connects, so more to come on OSINT, data, framing, information, outcomes, and markets.

Cheers, Chandler

Analytics and Insights, Insights, machine intelligence, Uncategorized

The relationship between humans, machines and markets

Technology has increased access to information, which in turn has made the world more similar at the macro and sub-macro levels. Yet despite that increased similarity, research shows business models rarely transfer horizontally, which underscores the importance of micro-level strategic consideration. Companies routinely enter new markets relying on knowledge of how their industry works and the competencies that led to success in their home markets, while not being cognizant of the granular details that can make the difference between success and failure in a new market. Only through machine-driven intelligence can companies address the level of detail needed in a scalable and fast manner to remain competitive.

 

Research from Harvard Business Review shows that despite increased access to information and networks through globalization, business models fail to transfer from one country to the next 86% of the time.

 

Relying on simple explanations for complex phenomena is a risk: 3% of markets are negatively correlated with one another. Organizations therefore need to contextualize marketplace characteristics and avoid simplistic linear or binary KPIs that address only one variable at a time.

[Chart: positive market-to-market correlations, Harvard Business Review]

The world is complex, but computational power is getting cheaper and more robust while machines are getting smarter. Even though only 11% of country-to-country profitability relationships were positive, asset-rich organizations with large networks and infrastructure are uniquely positioned to exploit this if they learn to understand their assets in higher resolution. In most cases this means letting go of long-held beliefs about how things used to be done, and being agnostic about how to make money.
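
As a sketch of how this kind of country-to-country comparison can be quantified, the snippet below builds a correlation matrix of per-market profitability and counts positive versus negative pairs; the market list and numbers are synthetic, not the HBR data.

```python
import numpy as np
import pandas as pd

# Synthetic quarterly profitability for a few markets (illustrative only).
rng = np.random.default_rng(7)
quarters = pd.period_range("2012Q1", periods=16, freq="Q")
markets = ["US", "DE", "BR", "IN", "CN"]
profit = pd.DataFrame(rng.normal(size=(16, 5)), index=quarters, columns=markets)

corr = profit.corr()

# Count positively / negatively correlated market pairs (upper triangle).
pairs = [(a, b) for i, a in enumerate(markets) for b in markets[i + 1:]]
pos = sum(corr.loc[a, b] > 0 for a, b in pairs)
neg = sum(corr.loc[a, b] < 0 for a, b in pairs)
print(f"{pos}/{len(pairs)} market pairs positively correlated, {neg} negative")
```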

The number of shares traded on the NYSE, shown here as an analog for how quickly information and connectivity are growing exponentially year over year. Despite these advances in communication technology, companies are no better at market access.

Furthermore, machine intelligence and information have rapidly diminished the value of expertise while eroding the value of information itself. The level of expertise needed to outrun or beat machine intelligence increases every year. Over the next one to two years, the most successful companies will come to accept that the burden of proof has shifted from technologies and A.I. to human expertise. Machines will also come to reframe what business and strategy mean. Business expertise in the future will be the ability to synthesize and explore data sets and create options using augmented intelligence, not subject-matter expertise per se. The game changers will be those with the fastest “information to action” at scale.

A residual of that characteristic is that a “good” or “OK” decision’s value is highest at the beginning and decays quickly, often making it worth more than a perfect decision reached later. To address this trend, organizations will need to develop processes and internal communication that drive faster “information-to-action” transaction times, similar to how traders look at financial markets. Those margins of competitive edge will continue to shrink, but they will become exponentially more valuable.
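
A back-of-the-envelope sketch of that decay dynamic, assuming the value of an opportunity halves on an arbitrary, purely illustrative schedule:

```python
def decision_value(quality, days_to_decide, half_life_days=14):
    """Value of a decision whose opportunity decays exponentially.

    quality: 0..1, how close the decision is to the 'perfect' one.
    half_life_days: assumed half-life of the opportunity (illustrative).
    """
    decay = 0.5 ** (days_to_decide / half_life_days)
    return quality * decay

# An 80%-quality decision tomorrow vs a perfect decision four weeks out.
print(f"good + fast:    {decision_value(0.80, days_to_decide=1):.2f}")
print(f"perfect + slow: {decision_value(1.00, days_to_decide=28):.2f}")
```

Under these assumed numbers, the fast 80% decision retains roughly three times the value of the perfect one, which is the intuition behind optimizing information-to-action speed.
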

Speed to market and competitive advantage

How are businesses harnessing AI and other technologies to lead the way?

Studies show experts consistently fail at forecasting, often performing worse than random guessing in fields as diverse as medicine, real-estate valuation, and political elections. This is because people weight experiences and information in heavily biased ways. In the knowledge economy, this is detrimental to strategy and business decisions.

Working with machines enables businesses to learn and quantify connections and influence in a way humans cannot. Rarely is an issue isolated to the confines of a specific domain; part of Walmart’s analytics strategy, for example, is to focus on key variables in the context of the other variables they connect to. This can be done in extremely high resolution by taking a machine-based approach to mining disparate data sets, which ultimately allows for flexibility and higher-resolution KPIs to base business decisions on.
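
As a toy sketch of that “variables in context” idea, the snippet below joins two invented, disparate data sets and computes a KPI conditioned on a connected variable rather than a flat average; the tables and column names are made up for illustration.

```python
import pandas as pd

# Two disparate, invented data sets: store sales and local weather.
sales = pd.DataFrame({
    "store": ["A", "A", "B", "B"],
    "week":  [1, 2, 1, 2],
    "units": [120, 90, 200, 260],
})
weather = pd.DataFrame({
    "store": ["A", "A", "B", "B"],
    "week":  [1, 2, 1, 2],
    "rain":  [False, True, False, True],
})

joined = sales.merge(weather, on=["store", "week"])

# Contextual KPI: average units conditioned on a connected variable,
# rather than one flat average across all conditions.
print(joined.groupby("rain")["units"].mean())
```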

What are the effects of digital disintermediation and the sharing economy on productivity growth?

Machines have increased humans’ ability to synthesize multiple information streams simultaneously while connecting communication, which could lead to higher utility on assets. Businesses in the future will likely have to focus more on opportunity cost and re-imagine asset allocation as competition increases due to lower barriers to entry. Inherently, intelligence and insights are about decisions, and as noted above, the winners will be the organizations with the fastest “information-to-action” transaction times, much like how traders look at financial markets.

Is this the beginning of the end?

Frameworks driven by machines will allow humans to focus on more meaningful and creative strategies that cut through noise to find the variables that can actually be controlled, mitigating superficial processes and problems. As a result, it is the end for people and companies that rely on information and routine for work, and the beginning for those who can solve abstract problems with creative and unorthodox thinking within tight margins. Those who do will also be able to scale those skills globally with advances in communication technology and the sharing economy, which will considerably speed up liquidity on both hard and knowledge-based assets.

 

Analytics and Insights, Change Management, Corporate Culture, Data Science, Insights, machine intelligence

Next Generation KPIs

Recently I’ve been thinking about ways to detect bias, as well as looking into what makes people share. Yes, understanding dynamics and trends over time, like the chart below (Topsy is a great, simple, free tool for basic Twitter trends), can be helpful, especially for linear forecasting. Nonetheless, these reach their limits when we want to look for deeper meaning, say at the cognitive or “information flow” level.

[Chart: Twitter mentions of the MN Vikings’ Teddy Bridgewater and Adrian Peterson]

Enter networks. These allow a much deeper level of understanding than is possible with standard KPIs like volume, publish count, and sentiment over time. By mapping a network based on entities, extracted locations, and similar text and language characteristics, it’s possible to plot the coordinates of how a headline, entity, or article exists and connects to other entities within a specific domain. In turn, this creates an analog of the physical world with stunning accuracy, since more information is reported online every day. For example, using online news articles and Bit.ly link data, I found that articles with less centrality to their domain (based on linguistic similarity across the aggregated on-topic news articles), which denotes variables being left out of the article, typically got shared the most on social channels. In short, articles that were narrower in focus, and therefore less representative of the broader domain, tended to be shared… This is just the tip of the iceberg.
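
A sketch of how that centrality-versus-sharing analysis can be run, assuming TF-IDF cosine similarity as the linguistic distance and degree centrality on a thresholded similarity graph; the article texts, share counts, and threshold are all invented for illustration.

```python
import networkx as nx
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Invented on-topic articles and their share counts (for illustration).
articles = [
    "vikings quarterback injury report and roster implications",
    "vikings injury report quarterback status practice notes",
    "quarterback injury practice report roster vikings notes",
    "one weird stat about the vikings nobody is talking about",
]
shares = np.array([120, 150, 90, 2400])

# Linguistic similarity between articles via TF-IDF cosine similarity.
sim = cosine_similarity(TfidfVectorizer().fit_transform(articles))

# Build a graph linking articles whose similarity clears a threshold.
G = nx.Graph()
G.add_nodes_from(range(len(articles)))
for i in range(len(articles)):
    for j in range(i + 1, len(articles)):
        if sim[i, j] > 0.2:          # arbitrary, illustrative threshold
            G.add_edge(i, j, weight=sim[i, j])

centrality = np.array([nx.degree_centrality(G)[i] for i in range(len(articles))])
r = np.corrcoef(centrality, shares)[0, 1]
print(f"centrality vs shares: r = {r:.2f}")  # negative r = outliers travel
```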

[Network graph: centrality, bias, and social sharing, @chandlertwilson]
