While modern-day AI or machine intelligence (MI) hype revolves around big "eureka" moments and broad-scale "disruption", expecting these events to occur regularly or in a scalable manner is unrealistic. The reality is that working with AI on simple or routine tasks will drive better decision making systemically as people become comfortable with the questions and strategies that are now possible. And hopefully, it will also build the foundation for those eureka(!) moments downstream. Regardless, technological instruments allow workers to process information faster while increasing the precision of understanding, enabling more complex and accurate strategies. In short, it's now possible to do in hours what once took weeks. Below are a few things I've found helpful to think about as one tries to drive machine intelligence in a large organization.
- Algorithm aversion: humans are more willing to accept flawed human judgment than machine judgment. People are very judgmental if a machine makes a mistake, even within the lowest margin of error. For example, only 21% of managers who implement strategy actually test how KPIs link back to organizational performance, and many of those who tested found their early assumptions flawed. Companies in the top third of data-driven decision making are 5% more productive and 6% more profitable. Decisions generated by simple algorithms are often more accurate than those made by experts, even when the experts have access to more information than the formulas use.
- Silos! The value of keeping your data/information secret as a competitive edge does not outrun the value of the potential innovation or insights if data is liberated within the broader organization. Where possible, build what I call diplomatic back channels where teams or analysts can share data with each other.
- To find opportunities or address problems, the organization will need to quickly move from linear analytics or business intelligence to weighting/graphing disparate data sets with artificial intelligence. Machines are far less biased than people and better at weighting the relationships between disparate events, objects, and data sets. Even simple statistical models have been shown to outperform the most experienced analysts.
- Develop a culture of skill and capacity. The majority of abstract, complex, and pressing work is typically outsourced, not built internally. Managers are willing to spend 42 percent more on an outside competitor's ideas. "This is why consultants get hired," says Leigh Thompson, a professor of management and organizations at the Kellogg School. "We bring in outside people to tell us something that we already know," because it paradoxically means all the wannabe "winners" on the team can avoid losing face. It's not that seeking external help is a bad thing, but in reality this translates into a lack of strategic capacity internally, and as a residual, the organization loses the opportunity to build those thinking and technological muscles. In short, avoid the temptation to outsource everything because nothing seems to be going anywhere right away. That 100-page PowerPoint deck the consultant you hired gave you isn't going to help if you don't have the infrastructure to drive the suggested outputs.
- Those who generally push for everything to be outsourced probably do so to limit exposure of their lack of skills and vision. In short, be skeptical of managers who always want to take this approach as opposed to building a solid foundation in MI. This is the derivative of a strong management culture that emphasizes generalists, not novel, unorthodox, or technical thinkers, in leadership roles. Research shows general managerial styles are less successful in technical or data-driven tasks than their counterparts (and most work is data- or technology-related these days). Further, they also foster a culture where exercising intellectual integrity or novelty on subject matters is a liability for one's career. In turn, you end up in an environment where technology is applied to legacy processes and thinking, i.e., neither as efficient nor as effective as it could be.
- Organizations need to evolve their processes in step with technology, not the other way around. Otherwise, "money" is left on the table that isn't even considered or known due to lack of domain depth, which creates hierarchical bottlenecks that inhibit innovation and merit-based thinking. As a result, the talented people in your organization will defect, leaving mediocre skill sets to run general operations that standardize lowest-common-denominator outputs, and failing to maximize the prior investment in capabilities or assets that might have offered a competitive edge had they been seen through to completion.
- Be more introspective and strategic with assets and cash. By integrating sensors (the Internet of Things), more sophisticated tracking, and the understanding and integration of physical and cash assets, organizations have the opportunity to maximize the utility of their assets. In turn, this fosters a "trading desk mentality" where companies operate more like a hedge fund on their own cash flow and purchases, while staying mindful of foreign exchange rates and commodities markets. This approach could also take advantage of assets such as shelf space and storage, much like Airbnb did with extra home space. Generally, over the last 10 years, companies with multiple revenue streams and consumer offerings, such as Apple and Amazon, have weathered the storm better than companies whose focus remains narrow.
- Create data markets or indices that your suppliers/vendors can opt into. Doing so helps compress prices by sharing the burden of costs associated with market research and/or supply chains. Furthermore, how to create value from data quickly will perhaps be the more pressing issue for large organizations. This will inherently drive the need for all of these organizations to treat their data and technology strategically, not as a commodity to be hoarded. Remember: data is only valuable if acted on.
- Create a social graph of your organization to understand how information becomes action. By leveraging machines to analyze internal information and asset flows and allocation within a graph-based approach, organizations are able to systemize and automate many routine processes, such as meetings or approval processes, that humans currently handle. Furthermore, intelligent agents analyzing the graph can bring together people working on disparate projects.
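The graph idea above can be sketched in a few lines. This is a minimal, illustrative example (the people, project names, and edges are hypothetical): model who works on what as a graph, then use betweenness centrality to surface the "connectors" who sit between otherwise-separate clusters and could bridge disparate projects.

```python
# Minimal sketch: an organization as a graph of people and projects.
# All names and edges here are hypothetical examples.
import networkx as nx

G = nx.Graph()
# Edges: which person works on which project
edges = [
    ("alice", "project_pricing"), ("bob", "project_pricing"),
    ("bob", "project_supply_chain"), ("carol", "project_supply_chain"),
    ("carol", "project_forecasting"), ("dave", "project_forecasting"),
]
G.add_edges_from(edges)

# Betweenness centrality scores nodes by how often they sit on shortest
# paths between other nodes -- high scorers are candidate "connectors".
centrality = nx.betweenness_centrality(G)
connectors = sorted(centrality, key=centrality.get, reverse=True)[:3]
print(connectors)
```

In a real deployment the edges would come from email metadata, calendar invites, or project management tools rather than a hand-written list, but the centrality step is the same.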
Our traditional institutions, leaders, and experts have shown themselves incapable of understanding and accounting for the multi-dimensionality and connectivity of systems and events: the rise of far-right parties in Europe, the disillusionment with European Parliament elections as evidenced by voter turnout in 2009 and 2014 (despite more money being spent than ever), Brexit, and now the election of Donald Trump as president of the United States of America. In short, there is little reason to trust experts without multiple data streams to contextualize and back up their hypotheses.
How could experts get it wrong? Frankly, it's time to shift out of the conventional ways we try to make sense of events in the political, market, and business domains. The first variable is reimagining information from a cognitive-linguistic standpoint, probably the most neglected area in all of business and politics, at least within the mainstream. The basic idea? Words have meaning. Meaning generates beliefs. And beliefs create outcomes, which in turn can be quantified. With the explosion of mass media, followed by identity-driven media, then social media and alternative media, we are at the mercy of media systems that frame our reality. If you doubt this, reference the charts below. Google Trends is deadly accurate in illustrating what is on people's minds: whatever dominates attention, bad or good, wins, at least when it comes to U.S. presidential elections. The saying "bad press is good press" is quantified here, as is George Lakoff's thinking on framing and repetition (Google search trends can easily be used to see which frame is winning, by the way).
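The "which frame is winning" check is simple to operationalize. Here is a minimal sketch using made-up weekly interest scores; in practice the numbers would come from a Google Trends CSV export (or an unofficial wrapper such as pytrends), and the terms would be the competing frames or candidates you care about.

```python
# Sketch: compare mean search interest for two competing terms to see
# which frame is "winning". The scores below are hypothetical,
# Trends-style values on a 0-100 scale.
import statistics

interest = {
    "candidate_a": [62, 71, 68, 80, 75],
    "candidate_b": [41, 38, 45, 40, 44],
}

means = {term: statistics.mean(scores) for term, scores in interest.items()}
winning_frame = max(means, key=means.get)
print(winning_frame, round(means[winning_frame], 1))
```

A mean is the crudest possible summary; a real analysis would also look at the trend over time and spikes around events, as the charts in this post do.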
Within this system, there is little reason to challenge one's beliefs and almost nothing forcing anyone to question their own. Institutions and old media systems used to be able to bottleneck this: they were the only ones with a soapbox, and information moved reasonably slowly. To outthink current systems, there is a need for a combination of sharper thinking, the ability to quantify unorthodox data such as open source intelligence (OSINT), and creativity that traditional systems of measurement and strategy lack. Business, markets, and people strive, to a fault, for simple, linear, and binary solutions or answers. Unfortunately, complexity doesn't dashboard into nice charts like the one below. The root causes of issues are ignored due to a lack of context, which in turn creates a large window for false assumptions about what affects business initiatives. These approaches are, for the most part, only helpful at the superficial level.
I know this may feel like a reach in terms of how everything mentioned is connected, so more on OSINT, data, framing, information, outcomes, and markets to come.
While investors were in shock, open source signals such as Google Trends pointed to "Leave" being predominant the majority of the time, illustrating expert and market biases. Perhaps they should work on how to integrate these unconventional data streams better (sorry, couldn't help it). The UK's decision to exit the EU is part of a larger global phenomenon that could have been understood better with open source, not just market, data.
The world is growing more complex. Information is moving faster. Humans did not evolve to retain or understand this mass output of (dis)information in any logical way. In response, a retreat to simple explanations and self-censorship toward new ideas that might challenge one's frame becomes the norm. Populist decisions are made and embraced, oftentimes as a reaction against the establishment or elite. Multinational corporations and elites will need to step outside of their bubble and take note as nationalist, albeit sometimes isolationist, figures and movements such as Donald Trump, Bernie Sanders, President Erdogan of Turkey, Marine Le Pen's Front National in France, Boris Johnson (former Mayor of London and Brexit backer, with a good chance of taking David Cameron's place), Germany's AfD, and the Five Star Movement in Italy gain in both popularity and power.
In addition to more media coverage, more people were associated with the Leave campaign, which is an advantage. During a political campaign, choices and policy lines are anything but logical; they tend to fall on emotional lines, so it's important that institutional communications have a noticeable figurehead, especially in the age of media. It says something when the top people associated with Remain are Barack Obama, Janet Yellen, and Christine Lagarde. Note that David Cameron is more central to Leave. Nonetheless, the pleas by political outsiders and institutions such as the IMF and World Bank for the UK to remain in the EU potentially damaged the Remain campaign. UK voters seemed not to want to hear from foreign political elites on the matter. This is illustrated by the connection and proximity of the "Obama" red cluster to the French right-wing forest-green cluster (and the results) below. Brexit could lend credence to the possibility of EU exit contagion. There are very real forces in France (led by the Front National) and Italy (led by the Five Star Movement, which just won big in elections) driving hard for secession from the EU and/or potentially the Eurozone.
- Seeking shelter from volatility, banks (especially European ones) are fleeing to the gold market. While this is to be expected, dividend-based stocks, as well as oil, would also be attractive to those seeking stability.
- Thursday's referendum sent global markets into turmoil. The pound plunged by a record amount, and the euro slid by the most since it was introduced in 1999. Historically, the British pound reached an all-time high of 2.86 in December 1957 and a record low of 1.05 in February 1985.
- Don't count on US interest rate hikes. Yellen has expressed concern about global volatility on multiple occasions, and Brexit just added to that. The Bank of England could follow the US Fed and drop interest rates on the GBP to account for market uncertainty.
- If aggressive, US companies could use European uncertainty as an opportunity to gain on European competitors. Given the somber mood within Europe, European companies may be more conservative with investment, leaving them vulnerable.
- Alternatively, Brexit may trigger more aggressive U.S. or global expansion by European companies while the ramifications are further understood.
- The political takeaway is that the Remain campaign was relatively sterile, having no figurehead or clear policy issues relating directly back to the EU. This was reflected by the diversity in search terms associated with the Leave campaign, and by Angela Merkel, not an EU leader such as European Commission President Jean-Claude Juncker, once again being seen as the de facto voice of Europe.
Recently I've been thinking of ways to detect bias, as well as looking into what makes people share. Yes, understanding dynamics and trends over time, like the chart below (Topsy is a great, simple, free tool for basic Twitter trends), can be helpful, especially with forecasting. Nonetheless, they reach their limits when we want to look for deeper meaning, say at the cognitive or "information flow" level.
Enter coordinates and networks. These offer advantages that aren't possible with traditional KPIs like volume over time. By mapping out a network based on entities, extracted locations, and similar text and language characteristics, it's possible to map the coordinates of how a headline, entity, or article exists in space within a specific domain. And this happens to be remarkably representative of the physical world, especially since more information is reported online every day.
When put together (shown below), I found a way to detect bias. Using online news articles and Bit.ly link data, I found that articles with less centrality to their domain, which denotes variables being left out (of the article), typically got shared the most on social channels. In short, some bias = more sharing. Interesting, although more research is needed.
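The bias-vs-sharing check described above can be sketched as follows. This is an illustrative toy, not the original analysis: articles are nodes, edges link articles with similar language or shared entities, and each article's centrality is compared against a (hypothetical) share count, under the hypothesis that low-centrality articles, those leaving more variables out, get shared more.

```python
# Toy sketch of the centrality-vs-sharing idea. Article IDs, edges,
# and share counts below are all hypothetical.
import networkx as nx

G = nx.Graph()
# Edges between articles with similar language/entities
G.add_edges_from([
    ("a1", "a2"), ("a1", "a3"), ("a2", "a3"),  # tightly connected core
    ("a3", "a4"),                              # a4 sits at the fringe
])
# Hypothetical Bit.ly-style share counts per article
shares = {"a1": 120, "a2": 150, "a3": 90, "a4": 900}

centrality = nx.degree_centrality(G)
# Rank articles from least to most central; the hypothesis is that the
# fringe (lower-centrality) articles attract the most shares.
fringe_first = sorted(centrality, key=centrality.get)
print(fringe_first[0], shares[fringe_first[0]])
```

With real data you would correlate the two quantities properly (e.g. a rank correlation across hundreds of articles) rather than eyeballing a four-node graph.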
For this post, I decided to remain old school and rely mainly on search data. It's pretty basic, but it typically offers a great view of what people are interested in. Google's market share is around 90% in Europe, and it's the most visited site in the world. In my opinion, Google Trends is the largest focus group in the world.
First I looked at the overall interest in Germany in Angela Merkel and Peer Steinbrück, as well as their political affiliations, the CDU and SPD. Initially, I was curious how interest in the party identity compared to interest in the politician. To anchor this chart, I did the same with the US presidential campaign (the chart below). I have a hunch, and the data seems to be telling me thus far, that the more media-oriented politics becomes (along with everything else in the world), the more important celebrity, authenticity, and individuality become. Take a look at this recent brand analysis done by Forbes. Chris Christie wins, with the highest approval rating of over 3,500 "brands" according to BAV (awesome company) at 78%. For those who don't know, Christie is probably the most straightforward, tell-it-like-it-is politician in the country.
So what can we learn from the Google search interest shown below?
- Politics is still about sheer volume and name recognition. For those who think being novel and unique achieves victory over blasting away nonstop in a strategically framed and coordinated way, think again. People tune out if they aren't interested. Irrelevance is almost always worse than bad PR or sentiment (excluding a case like Anthony Weiner). You simply don't win if you don't interest people; if people aren't talking about you, you're not interesting. Merkel had more search interest than Steinbrück and, over the course of the year, probably got 10,000 times more airtime, both good and bad, due to her large role in the euro crisis. In short, repetition is king.
- Framing and a consistent language strategy are vital. Volume can be shown to equate with recognition of a person, and this extends easily enough to a policy or issue. Give me a choice between a clever social media strategy and a consistent language strategy, meaning all the key issues are repeated by the party and coordinated as much as possible, and I'll take the language strategy any day. It's amazing how often simple consistency in political communications is overlooked by companies and political leaders in Europe. Social media tends to be a framing conduit, not the reason people mobilize or have opinions.
- The world is growing ever more connected. Look at how global the reporting of the German election was. Obviously, its importance was heightened by Germany's rising influence, but nonetheless the number of sources from all over the world is impressive. A note for the upcoming EU elections: don't forget to target the USA and other regions in order to influence specific regions in Europe. A German constituent might read about a policy in the Financial Times, a Frenchman in the Wall Street Journal, or an American based in Brussels who knows Europeans who can vote, in Bloomberg.
I decided to throw in the Twitter market share of the candidates from August 21st to September 21st, the day prior to the elections. I found it interesting to see how closely Belgium and the United States reflect Germany, probably because these countries are looking at the elections from more of a spectator's view. Meanwhile, southern Europe, which had a vested interest in the election, was pretty much aligned. France, Spain, and Italy seem to report a bit more, and in a similar way, on Merkel, probably due to sharing the same media sources. Unfortunately, I don't have time to look into this pattern much at the moment, but it's something I'll continue to think about in the future.
TTIP/TAFTA is a true game changer for both the EU and US in terms of economic value, especially in a time of crisis for the EU. To find out exactly what content people consumed and to analyze policy trends, we mined the web (big data). At the moment, TTIP/TAFTA is not without issues. As we all know in Brussels, #NSAGate, data privacy, and IP are slowing down negotiations (we're looking at you, France), and this is generally what the data had to say as well.
Overview: 5,505 mentions of TTIP/TAFTA in the last 100 days, too large a number for businesses and institutions to ignore. In short, you need to join the conversation ASAP if you have something to say about it (indecision is a decision).
The biggest uptick – a total of 300 mentions – came when Obama spoke at the G-8 summit in Ireland on June 17th when trade talks began. The key theme at this time was the potential boost in the economy. The official press release is here.
"The London-based Centre for Economic Policy Research estimates a pact – to be known as the Transatlantic Trade and Investment Partnership – could boost the EU economy by 119 billion euros (101.2 billion pounds) a year, and the U.S. economy by 95 billion euros. However, a report commissioned by Germany's non-profit Bertelsmann Foundation and published on Monday, said the United States may benefit more than Europe. A deal could increase GDP per capita in the United States by 13 percent over the long term but by only 5 percent on average for the European Union, the study found."
Given the conflicting information, we wanted to see whose idea and data win out: the Centre for Economic Policy Research (CEPR) or the Bertelsmann Foundation (BF)? To do this, we looked at which study was referenced most. The chart below shows the mentions of each organization within the TTIP/TAFTA conversation over the last 100 days. The Centre for Economic Policy Research is in orange and the Bertelsmann Foundation is in green.
In total both studies were cited almost the same amount:
- Centre for Economic Policy Research: 80 Mentions
- Bertelsmann Foundation: 83 mentions
- Both organizations were mentioned together 53 Times.
More recently, though, the trend shows the CEPR study being cited most in the last 30 days, including a large uptick on July 8th. This is mainly due to the market share of sources being located more in the US, and the US wanting to get a deal done faster than the more hesitant Europeans. Keep in mind that the CEPR study claims larger benefits from TAFTA/TTIP than the BF study does.
Where are the mentions?
- The US had 2,743 mentions (49% overall)
- All of Europe combined total was 1,986 (36% overall)
Of the topics TTIP/TAFTA is being discussed alongside, ACTA is still being talked about, and IP and data protection top the list. This is not surprising given France's reluctance to be agreeable because of the former and #prism, so those themes are plotted below.
The top stories on Twitter are in the table below. It’s not surprising that the White House is number one, but where are the EU institutions and media on this?
| Top Stories | Tweets | Retweets | All Tweets | Impressions |
Everybody knows the battle for people's hearts and minds starts with a good acronym, so I broke down the market share between TTIP (165) and TAFTA (2197):
I may add more in the coming days, but those are a few simple bits of info for now. Nonetheless, if you want to join the conversation on Twitter, the top hashtags are below.