Analytics and Insights, Change Management, Corporate Culture, Data Science, machine intelligence

Driving Machine Intelligence within an Organization

While modern-day A.I. or machine intelligence (MI) hype revolves around big "eureka" moments and broad-scale "disruption", expecting these events to occur regularly or in a scalable manner is unrealistic. The reality is that working with AI on simple or routine tasks will drive better decision making systemically as people become comfortable with the questions and strategies that are now possible. And hopefully, it will also build the foundation for those eureka(!) moments downstream. Regardless, technological instruments allow workers to process information faster and with greater precision, enabling more complex and accurate strategies. In short, it's now possible to do in hours what once took weeks. Below are a few things I've found helpful to think about as one tries to drive machine intelligence in a large organization.

  • Algorithm aversion — humans are willing to accept flawed human judgment, yet become very judgmental when a machine makes a mistake, even within the lowest margin of error. For example, only 21% of managers who implement strategy actually test how KPIs link back to organizational performance, and many of those who did test found their early assumptions flawed. Companies in the top third of data-driven decision making are 5% more productive and 6% more profitable. Decisions generated by simple algorithms are often more accurate than those made by experts, even when the experts have access to more information than the formulas use.
  • Silos! The value of keeping your data/information secret as a competitive edge does not outrun the value of the potential innovation or insights if that data is liberated within the broader organization. Where possible, build what I call diplomatic back channels where teams or analysts can share data with each other.
  • To find opportunities or address problems, the organization will need to move quickly from linear analytics or business intelligence to weighting/graphing disparate data sets with artificial intelligence. Machines are far less biased than people and better at weighting the relationships between disparate events, objects, and data sets. Even simple statistics have been shown to outperform the most experienced analysts.
  • Develop a culture of skill and capacity. The majority of abstract, complex and pressing work is typically outsourced, not built internally. Managers are willing to spend 42 percent more on an outside competitor's ideas. "This is why consultants get hired," says Leigh Thompson, a professor of management and organizations at the Kellogg School. "We bring in outside people to tell us something that we already know," because it paradoxically means all the wannabe "winners" on the team can avoid losing face. It's not that seeking external help is a bad thing, but in reality this translates into a lack of strategic capacity internally, and as a residual, the organization loses the opportunity to build those thinking and technological muscles. In short, avoid the temptation to outsource everything just because nothing seems to be going anywhere right away. That 100-page PowerPoint deck the consultant you hired gave you isn't going to help if you don't have the infrastructure to drive the suggested outputs.
  • Those who push for everything to be outsourced probably do so to limit exposure of their own lack of skills and vision. In short, be skeptical of managers who always want to take this approach as opposed to building a solid foundation in MI. This is a derivative of a management culture that emphasizes generalists, not novel, unorthodox or technical thinkers, in leadership roles. Research shows generalist managerial styles are less successful in technical or data-driven tasks than their specialist counterparts (and most work is data or technology related these days). Further, they also foster a culture where exercising intellectual integrity or novelty on subject matters is a liability for one's career. In turn, you end up in an environment where technology is bolted onto legacy processes and thinking, i.e. neither as efficient nor as effective as it could be.
  • Organizations need to evolve their processes in step with technology, not the other way around. Otherwise, "money" is left on the table that isn't even considered or known due to a lack of domain depth, which creates hierarchical bottlenecks that inhibit innovation and merit-based thinking. As a result, the talented people in your organization will defect, leaving mediocre skill sets to run general operations that standardize lowest-common-denominator outputs, and failing to maximize prior investments in capabilities or assets that might have offered a competitive edge had they been seen through sufficiently.
  • Be more introspective and strategic with assets and cash. By integrating sensors (the Internet of Things), more sophisticated tracking, and the understanding and integration of physical and cash assets, organizations have the opportunity to maximize the utility of their assets. In turn, this fosters a "trading desk mentality" where a company operates more like a hedge fund on its own cash flow and purchases, while staying mindful of foreign exchange rates and commodities markets. This approach can also take advantage of assets such as shelf space and storage, much like Airbnb did with extra home space. Generally, over the last 10 years, companies with multiple revenue streams and consumer offerings, such as Apple and Amazon, have weathered the storm better than companies whose focus remains narrow.
  • Create data markets or indices that your suppliers/vendors can opt into. Doing so helps compress prices by sharing the burden of costs associated with market research and/or supply chains. Furthermore, how to create value from data quickly will perhaps be the more pressing issue for large organizations. This will inherently drive the need for these organizations to treat their data and technology strategically, not as a commodity to be hoarded. Remember: data is only valuable if acted on.
  • Create a social graph of your organization to understand how information becomes action. By leveraging machines to analyze internal information and asset flows and allocation within a graph-based approach, organizations can systemize and automate many routine processes, such as meetings or approvals, that humans currently handle. Furthermore, intelligent agents analyzing the graph can bring together people working on disparate projects. A minimal sketch of this idea follows the list.
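On that last point, here is a minimal sketch in Python using networkx; the team names, handoff counts, and the use of betweenness centrality as a proxy for information bottlenecks are all assumptions for illustration, not a prescription.

```python
# Sketch: model internal information flow as a directed graph and surface
# potential bottlenecks. Team names and handoff counts are purely illustrative.
import networkx as nx

G = nx.DiGraph()

# Edges represent information handoffs (reports, approvals, requests),
# annotated with a hypothetical monthly frequency.
handoffs = [
    ("analytics", "marketing", 30),
    ("analytics", "finance", 12),
    ("marketing", "exec_team", 8),
    ("finance", "exec_team", 20),
    ("operations", "finance", 25),
    ("exec_team", "operations", 5),
]
for src, dst, freq in handoffs:
    G.add_edge(src, dst, monthly_handoffs=freq)

# Betweenness centrality as a rough proxy for "who sits in the middle of
# information flow" -- candidates for automating approvals or adding capacity.
bottlenecks = nx.betweenness_centrality(G)
for team, score in sorted(bottlenecks.items(), key=lambda kv: -kv[1]):
    print(f"{team:12s} {score:.3f}")
```

Teams that score high here are the ones most information has to pass through, which is where automating routine approvals tends to pay off first.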
Analytics and Insights, Data Science, Insights, Linguistics, Lobbying, machine intelligence, Politics, Public Affairs and Communications, U.S. Politics

OSINT one. Experts zero.

Our traditional institutions, leaders, and experts have shown themselves incapable of understanding and accounting for the multi-dimensionality and connectivity of systems and events: the rise of far-right parties in Europe, the disillusionment with European Parliament elections as evidenced by voter turnout in 2009 and 2014 (despite more money being spent than ever), Brexit, and now the election of Donald Trump as president of the United States of America. In short, there is little reason to trust experts without multiple data streams to contextualize and back up their hypotheses.

How could experts get it wrong? Frankly, it's time to shift out of the conventional ways we try to make sense of events in the political, market and business domains. The first variable is reimagining information from a cognitive linguistic standpoint, probably the most neglected area in all of business and politics – at least within the mainstream. The basic idea? Words have meaning. Meaning generates beliefs. And beliefs create outcomes, which in turn can be quantified. The explosion of mass media, followed by identity-driven media, then social media and alternative media, has left us at the mercy of media systems that frame our reality. If you doubt this, reference the charts below. Google Trends is deadly accurate in illustrating what is on people's minds; whoever occupies people's minds the most, bad or good, wins – at least when it comes to U.S. presidential elections. The saying "bad press is good press" is quantified here. As is George Lakoff's thinking on framing and repetition (Google search trends can be used to easily see which frame is winning, BTW).

[Image: Google search trends of presidential candidates, 2004 to present]

Google search trends of Democratic and Republican presidential candidates from 2004 to 2016. The candidate with the highest search volume won in every race.
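If you want to pull this kind of comparison yourself, here is a hedged sketch using pytrends, an unofficial Google Trends client; the keywords, timeframe, and geography are my own assumptions, and this is not how the charts here were originally generated.

```python
# Sketch: compare relative search interest for two candidates with pytrends,
# an unofficial Google Trends client (pip install pytrends).
from pytrends.request import TrendReq

pytrends = TrendReq(hl="en-US", tz=360)
pytrends.build_payload(
    kw_list=["Donald Trump", "Hillary Clinton"],
    timeframe="2016-01-01 2016-11-08",  # primary season through election day (assumed window)
    geo="US",
)

interest = pytrends.interest_over_time()  # relative index on a 0-100 scale
print(interest[["Donald Trump", "Hillary Clinton"]].mean())
```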

Social media and news mentions per hour of the key 2016 presidential candidates, using the query "Donald Trump" OR "Hillary Clinton" OR the hashtags #ElectionDay OR #Election2016.

[Image: Google search trends of Nicolas Sarkozy, François Fillon and Alain Juppé]

Google trends wasn’t caught off guard by François Fillon’s win over Nicolas Sarkozy and Alain Juppé in the French Republican primaries . It was closer than the polls expected all along.

Within this system, there is little reason to challenge one's beliefs and almost nothing forcing anyone to question their own. Institutions and old media systems used to be able to bottleneck this; they were the only ones with a soapbox, and information moved slowly enough. To outthink current systems, there is a need for a combination of sharper thinking, the ability to quantify unorthodox data such as open source intelligence (OSINT), and the creativity that traditional systems of measurement and strategy lack. Business, markets, and people strive, to a fault, for simple, linear and binary solutions or answers. Unfortunately, complexity doesn't dashboard into nice charts like the one below. The root causes of issues are ignored due to a lack of context, which in turn creates a large window for false assumptions about what affects business initiatives. These approaches are, for the most part, only helpful at a superficial level.

[Image: linear dashboard]

A nice linear BI chart above. Unfortunately, the world is much more complex and connected – like the network graph below, which clusters together news media that covered Hillary Clinton and Donald Trump.

[Image: network graph of news media coverage of Donald Trump and Hillary Clinton]

I know this may feel like a reach in terms of how everything mentioned here is connected, so more on OSINT, data, framing, information, outcomes, and markets to come.

Cheers, Chandler

Analytics and Insights, Insights, machine intelligence, Uncategorized

The relationship between humans, machines and markets

Technology has increased access to information, which in turn has made the world more similar at the macro and sub-macro level. Despite that increased similarity, however, research shows business models rarely transfer horizontally, emphasizing the importance of micro-level strategic consideration. Companies routinely enter new markets relying on knowledge of how their industry works and the competencies that led to success in their home markets, while not being cognizant of the granular details that can make the difference between success and failure in a new market. Only through machine-driven intelligence can companies address the level of detail needed in a scalable and fast manner to remain competitive.

 

Research from Harvard Business Review shows that despite increased access to information and networks through globalization, business models fail to transfer from one country to the next 86% of the time.

 

Relying on simple explanations for complex phenomena is a risk – 3% of markets have a negative correlation to one another. Therefore, organizations need to contextualize marketplace characteristics and avoid simplistic linear or binary KPIs that address only one variable at a time.

[Image: Positive correlation chart, HBR]

The world is complex, but computational power is getting cheaper and more robust, while machines are getting smarter. Despite the fact that only 11% of country-to-country profitability relationships are positive, asset-rich organizations with large networks and infrastructure are uniquely positioned to exploit this if they learn to understand their assets in higher resolution. In most cases this means letting go of long-held beliefs and how things used to be done, in addition to being agnostic about how to make money.

The volume of shares traded on the NYSE, used here as an analog for how quickly information and connectivity are growing exponentially every year. Despite advances in communication technology, companies are no better at market access.

Furthermore, machine intelligence and information have rapidly diminished the value of expertise, in addition to eroding the value of information itself. The level of expertise needed to outrun or beat machine intelligence increases every year. Over the next one to two years, the most successful companies will come to accept that the burden of proof has shifted from technologies and A.I. to human expertise. Furthermore, machines will come to reframe what business and strategy mean. Business expertise in the future will be the ability to synthesize and explore data sets and create options using augmented intelligence – not being an expert on a subject per se. The game changers will be those with the fastest "information to action" at scale.

A residual of that characteristic is that a "good" or "ok" decision's value is exponentially highest in the beginning – and oftentimes worth more than a perfect decision made later. To address this trend, organizations will need to focus on developing processes and internal communication that foster faster "information-to-action" opportunity cost transaction times, similar to how traders look at financial markets. Those margins of competitive edge will continue to shrink, but will become exponentially more valuable.
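To make that concrete, here is a toy model of the idea, assuming a decision's value decays exponentially with the time it takes to act; the half-life and the "ok vs. perfect" discount are made-up numbers for illustration only.

```python
# Toy model: the value of acting on information decays with latency.
# Initial value, half-life, and quality discounts are arbitrary illustrations.
import math

def decision_value(initial_value: float, half_life_days: float, latency_days: float) -> float:
    """Value remaining after `latency_days` of delay, assuming exponential decay."""
    decay_rate = math.log(2) / half_life_days
    return initial_value * math.exp(-decay_rate * latency_days)

# An "ok" decision made in 2 days vs. a "perfect" decision made in 30 days.
ok_but_fast = 0.7 * decision_value(100, half_life_days=10, latency_days=2)
perfect_but_slow = 1.0 * decision_value(100, half_life_days=10, latency_days=30)
print(f"ok but fast:      {ok_but_fast:.1f}")    # ~61
print(f"perfect but slow: {perfect_but_slow:.1f}")  # ~13
```

Under these assumed numbers, the fast, imperfect call beats the slow, perfect one by a wide margin, which is the whole point of measuring information-to-action time.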

Speed to market and competitive advantage

How are businesses harnessing AI and other technologies to lead the way?

Studies show experts consistently fail at forecasting and traditionally perform worse than random guessing in fields as diverse as medicine, real-estate valuation, and political elections. This is because people traditionally weight experiences and information in very biased ways. In the knowledge economy this is detrimental to strategy and business decisions.

Working with machines enables businesses to learn and quantify connections and influence in a way humans cannot. Rarely is an issue isolated to the confines of a specific domain, and part of Walmart's analytics strategy is to focus on key variables in the context of the other variables they are connected to. This can be done in extremely high resolution by taking a machine-based approach to mining disparate data sets, which ultimately allows for flexible, higher-resolution KPIs to base business decisions on.
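As a rough illustration of what mining disparate data sets for higher-resolution KPIs can look like, here is a sketch in pandas; the files, columns, and weather example are hypothetical, not a description of Walmart's actual pipeline.

```python
# Sketch: join two disparate data sets (store sales and local weather) and
# check how a KPI co-moves with a contextual variable.
# All file and column names are hypothetical.
import pandas as pd

sales = pd.read_csv("daily_store_sales.csv", parse_dates=["date"])    # store_id, date, revenue
weather = pd.read_csv("local_weather.csv", parse_dates=["date"])      # store_id, date, temp_f, precip_in

merged = sales.merge(weather, on=["store_id", "date"], how="inner")

# A higher-resolution KPI: revenue per store conditioned on context,
# rather than a single company-wide revenue number.
by_weather = (
    merged.assign(rainy=merged["precip_in"] > 0.1)
          .groupby(["store_id", "rainy"])["revenue"]
          .mean()
)

print(by_weather.unstack("rainy").head())
print(merged[["revenue", "temp_f", "precip_in"]].corr())
```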

What are the effects of digital disintermediation and the sharing economy on productivity growth?

Machines have increased humans' ability to synthesize multiple information streams simultaneously while connecting communication, which could lead to higher utility on assets. It's likely that businesses in the future will have to be more focused on opportunity cost and re-imagine asset allocation as competition increases due to lower barriers to entry. Inherently, intelligence and insights are about decisions. A residual of that characteristic is that a good or ok decision's value is exponentially highest in the beginning – and oftentimes worth more than a perfect decision made later. To address this trend, organizations need to focus on developing processes and internal communication that foster faster "information-to-action" transaction times, much like how traders look at financial markets.

Is this the beginning of the end?

Frameworks driven by machines will allow humans to focus on more meaningful and creative strategies that cut through noise to find the variables that can actually be controlled, mitigating superficial processes and problems. As a result, it is the end for people and companies that rely on information and routine for work, and the beginning for those who can solve abstract problems with creative and unorthodox thinking within tight margins. Those who do so will also be able to scale those skills globally with advancements in communication technology and the sharing economy, which will speed up liquidity on hard and knowledge-based assets considerably.

 

Analytics and Insights, Change Management, Corporate Culture, Data Science, Insights, machine intelligence

Next-generation KPIs

Recently I've been thinking of ways to detect bias, as well as looking into what makes people share. Yes, understanding dynamics and trends over time, like the chart below (Topsy is a great, simple, free tool for basic Twitter trends), can be helpful – especially with forecasting. Nonetheless, these metrics reach their limits when we want to look for deeper meaning – say at the cognitive or "information flow" level.

[Image: Twitter mentions of MN Vikings' Teddy Bridgewater and Adrian Peterson]

Enter coordinates and networks. These advantages are not possible with traditional KPIs like volume over time. By mapping out a network based on entities, extracted locations, and similar text and language characteristics, it's possible to map the coordinates of how a headline, entity or article exists in space within a specific domain. And this happens to be deadly representative of the physical world, especially since more information is reported online every day. A minimal sketch of the approach is below.
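This sketch assumes you already have article text in hand; TF-IDF cosine similarity stands in for the richer entity and location extraction described above, and the similarity threshold is arbitrary.

```python
# Sketch: build a similarity network of articles and compute each node's
# centrality within its domain. TF-IDF cosine similarity is a simple stand-in
# for richer entity/location-based features; the 0.1 threshold is arbitrary.
import networkx as nx
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

articles = {
    "a1": "Clinton leads in national polls ahead of debate",
    "a2": "Trump rallies supporters in Florida ahead of debate",
    "a3": "Debate night: Clinton and Trump clash on trade",
    "a4": "Local team wins championship in overtime thriller",
}

ids = list(articles)
tfidf = TfidfVectorizer(stop_words="english").fit_transform(articles.values())
sim = cosine_similarity(tfidf)

G = nx.Graph()
G.add_nodes_from(ids)
for i in range(len(ids)):
    for j in range(i + 1, len(ids)):
        if sim[i, j] > 0.1:                      # connect sufficiently similar articles
            G.add_edge(ids[i], ids[j], weight=float(sim[i, j]))

centrality = nx.degree_centrality(G)             # how central an article is to its domain
positions = nx.spring_layout(G, seed=42)         # the "coordinates" of each article
print(centrality)
```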

When put together (shown below), I found a way to detect bias. Using online news articles and Bit.ly link data, I found that articles with less centrality to their domain – which denotes variables being left out of the article – typically got shared the most on social channels. In short, some bias = more sharing… interesting, although more research is needed.
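The bias-versus-sharing check itself can then be sketched as a rank correlation between each article's centrality and its share count; the centrality and share numbers below are hypothetical placeholders for the Bit.ly data.

```python
# Sketch: test whether lower-centrality (less contextual, more "biased") articles
# get shared more. All numbers are hypothetical placeholders for Bit.ly click data.
from scipy.stats import spearmanr

centrality = {"a1": 0.67, "a2": 0.67, "a3": 1.00, "a4": 0.00}   # e.g. from the graph sketch above
shares     = {"a1": 1200, "a2": 3400, "a3": 800,  "a4": 5000}   # hypothetical Bit.ly clicks

ids = sorted(centrality)
rho, p_value = spearmanr([centrality[i] for i in ids], [shares[i] for i in ids])
print(f"Spearman rho = {rho:.2f} (p = {p_value:.2f})")  # negative rho => less central, more shared
```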

[Image: centrality, bias and social sharing – @chandlertwilson]
