The Great Tech Convergence

What if I told you that you could have all the efficiency, speed, streamlined operations, accurate forecasting, smart insights, better opportunities, and enhanced revenue at your fingertips – with the right technology by your side?

Make that “the right technologies”. In the plural. Therein lies the answer, and it’s worth reiterating, because this simple insight is easy to miss. The great tech convergence is the natural coming together of three maturing technologies – the Internet of Things (IoT), Artificial Intelligence (AI), and Augmented and Virtual Reality (AR/VR) – that have, thus far, traversed independent trajectories. And when they come together, the whole is far bigger than the mere sum of its parts.

Different technologies have different capabilities. And limitations. Automation can help you speed up workflows, but it can’t tell you what a customer wants. Sensors keep a finger on the pulse of your machinery, but can’t by themselves predict a breakdown. Augmented reality is excellent assistive technology, but it can’t make your decisions for you.

If you run an enterprise or organization of any sort, you’ve probably had to choose one over the other. You’ve likely had to accept that trade-off as an inevitable part of the deal.

But why settle for a slice of the pie when you can have it all?

Blame it on risk-averse management, a financial crunch, poor technical understanding, or a lack of clarity about what you want technology to do for your business. The fact remains that enterprises have typically taken a cautiously incremental approach to all things technology. Caution is a good thing, especially when it comes to the newest shiny piece of technology. But now that we’ve crossed the early-adoption threshold, hit maturity, and seen irrefutable evidence of what Artificial Intelligence, the Internet of Things, and Augmented Reality can do for us, there’s no room for doubt any longer. It’s time to go big and go bold!

By that I don’t mean spending more or merely acquiring the latest technology for its own sake. What I’m talking about is ditching the piecemeal approach to focus on the big picture. It’s been said before and I’ll say it again…

The silos must go!

The current practice is to define a specific use case and then plug in a piece of technology to fulfill that requirement. Here’s the problem with this method. When you automate one isolated process here and upgrade another one there, you reap benefits in a fragmented manner. This shortsighted approach doesn’t do much for the value chain as a whole. As long as some areas remain trapped under the burden of legacy systems and workflows, they will continue to hold back progress. Despite massive investments of time, effort and money, the magic will fail to materialize.

The Internet of Things, Artificial Intelligence, Augmented/Virtual Reality – think of these different technologies as the different parts of your body. They each excel at their own specific jobs. But in order to achieve anything of real value, they must speak to each other and work in tandem. They must function like parts of a whole, so that the whole can become greater than the sum of its parts. In other words, they must converge!

THE GREAT TECH CONVERGENCE – LIKE HUMANS, BUT BETTER

Let’s understand how exactly that will work.

The role of the Internet of Things, which consists of connected edge devices equipped with sensors, is analogous to that of your sensory organs, which perceive and gather inputs. This functionality is sufficient for use cases involving asset management, monitoring, surveillance, and data collection. However, by themselves, connected devices are incapable of making much sense of the data.

That is a job for Artificial Intelligence, which works rather like the brain; its role is often to diagnose the problem. AI is experienced in several forms, with capabilities such as Machine Learning, Computer Vision, Natural Language Processing, Deep Learning, and so on. These myriad forms of AI have the power to sift through, process, and analyze data, to assemble a complete puzzle from the scattered pieces, and to distil patterns that humans cannot – or at least not easily. AI can crunch the streams of data coming in from the factory floor, digest them to build patterns of normal machine behavior, identify any anomalies, and then extrapolate to predict when a failure is likely. Armed with this information, your maintenance team is well prepared to proactively prevent breakdowns and slash unproductive downtime.
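To make the idea tangible, here is a minimal sketch of anomaly detection over a stream of sensor readings. The window size, threshold, and simulated readings are arbitrary choices for illustration – not how any particular platform implements it.

```python
from collections import deque
from statistics import mean, stdev

# Illustrative only: a rolling baseline of "normal" sensor behavior,
# with simple z-score anomaly flagging. Window size and threshold are
# arbitrary choices for this sketch.
WINDOW = 50          # readings used to model normal behavior
Z_THRESHOLD = 3.0    # how far from normal counts as an anomaly

baseline = deque(maxlen=WINDOW)

def check_reading(value):
    """Return 'anomaly' if the reading deviates strongly from the
    recent baseline, else 'normal'; always record the reading."""
    status = "normal"
    if len(baseline) == WINDOW:
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and abs(value - mu) / sigma > Z_THRESHOLD:
            status = "anomaly"
    baseline.append(value)
    return status

# Simulated vibration readings that drift sharply out of spec at the end
readings = [1.0 + 0.01 * i for i in range(60)] + [2.5, 2.7, 3.0]
for r in readings:
    if check_reading(r) == "anomaly":
        print(f"Possible impending failure: reading {r} is far from baseline")
```

A real deployment would feed such a check from live device telemetry and use a learned model rather than a fixed z-score rule, but the flow – baseline, detect, predict, alert – is the same.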

To stay connected as they work, your maintenance techs need real-time access to updates, technical information, and expert support; it helps significantly to have relevant insights instantly accessible and visible – literally. Virtual Reality can make a real impact in training technicians and preparing them for the fix, while Augmented Reality can overlay step-by-step instructions on what needs to be done.

Now let’s plug this into a well-defined use-case: Jet Engine Maintenance.

THE GREAT TECH CONVERGENCE IN ACTION (Sample Illustration)

Just imagine how this development could completely transform aviation! With minimal downtime and disruption to flight schedules, passengers can be assured of safer and far more convenient air travel. Pilots can breathe easier knowing that their systems are always in top operating condition. The aviation industry stands to save millions in maintenance costs and liabilities.

Connect the dots

Whether your enterprise operates in the retail space, transportation, hospitality, manufacturing, or any other area, you stand to benefit enormously from the synergy of new-age technologies.

To get the most out of your technology deployments, you will need to ensure that they integrate seamlessly into your organizational workflows and don’t remain confined to one section of operations. Crucially, they must be scalable and flexible enough to accommodate your growing needs and future requirements. The key to achieving this is to work with a platform that brings you the benefits of varied technologies in an integrated manner.

Infosys Nia™, our AI platform, is designed to deliver just such a seamless and integrated suite of technologies and solutions, helping you ensure that all areas of operations feel the benefits and remain in sync. Combined with our AR/VR capabilities, Infosys Nia is actively driving the great tech convergence.

When technologies converge, they perform complementary functions and support each other to create a comprehensive, intelligent ecosystem that can transform enterprises and operations inside out.

Is your organization ready to experience and benefit from this convergence of technologies?

Why Infosys Nia™ Advanced Machine Learning (Part 1)

Machine learning is often mentioned in the same breath as Artificial Intelligence (AI) these days. There are many who lay claim to the uniqueness of their AI offering because it features machine learning (ML). So what sets Infosys Nia apart? Why is Infosys Nia Advanced Machine Learning better positioned to deliver success for businesses? Let us explore.

In March 2017, Infosys acquired Skytree, “The Machine Learning Company”. Skytree’s capabilities are now integrated within the Infosys AI platform, Infosys Nia, and are known as Infosys Nia Advanced Machine Learning. Three major aspects make it uniquely effective:

  • Automation
  • Speed & Scalability
  • Ease of Use

Let us examine these briefly.

Automation

Typically in data science, to create a good ML model the user has to choose which ML algorithm to run on their data (decision tree, deep learning, support vector machine, etc.), and then select suitable values for, and optimally tune, up to dozens of hyperparameters specific to that algorithm. The only way to do this well is to be an expert.

In contrast, the Infosys Nia AutoModel capability allows a user who doesn’t know any of this to still build ML models, or allows the expert to save a great deal of time that would have been spent in tedious manual model tuning. AutoModel combines our Smart Search through the algorithm hyperparameter space with the ability to tune multiple algorithms in one run, resulting in a model that is competitive with the experts but with very little user effort or time. The AutoModel process is fully general, updating both the algorithm chosen and the hyperparameters tested after every iteration.
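The AutoModel internals are Infosys Nia’s own; purely as a sketch of the general pattern – searching across multiple algorithms and their hyperparameters in one run and keeping the best model – here is a minimal scikit-learn version. The candidate algorithms and parameter grids are arbitrary, and a real system would refine them iteratively rather than fixing them up front.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.model_selection import RandomizedSearchCV

X, y = load_breast_cancer(return_X_y=True)

# One candidate search space per algorithm; a smarter system would update
# both the algorithm choice and the tested hyperparameters after each round.
candidates = [
    (RandomForestClassifier(random_state=0),
     {"n_estimators": [50, 100, 200], "max_depth": [3, 5, None]}),
    (GradientBoostingClassifier(random_state=0),
     {"n_estimators": [50, 100, 200], "learning_rate": [0.01, 0.1, 0.3]}),
]

best_score, best_model = -1.0, None
for estimator, grid in candidates:
    # Randomized search stands in here for a smarter hyperparameter search.
    search = RandomizedSearchCV(estimator, grid, n_iter=5, cv=3, random_state=0)
    search.fit(X, y)
    if search.best_score_ > best_score:
        best_score, best_model = search.best_score_, search.best_estimator_

print(f"Selected {type(best_model).__name__} with CV accuracy {best_score:.3f}")
```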

In addition to AutoModel, we have further generalized the system to automatically perform feature engineering on an input dataset; this is our functionality called AutoFeaturize. As is well-known, good feature engineering can significantly improve the value of a model, but it can be time consuming. With Infosys Nia, further user time is saved by not having to perform all such engineering manually. Feature engineering is a huge field, and we are working to rapidly expand this capability in future releases.
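Again as a rough illustration of the general idea rather than of AutoFeaturize itself: one simple, generic form of automated feature engineering is mechanically generating interaction features and checking whether they improve a cross-validated score.

```python
from sklearn.datasets import load_diabetes
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

X, y = load_diabetes(return_X_y=True)

# Baseline: the raw features as-is.
baseline = cross_val_score(Ridge(), X, y, cv=5).mean()

# "Automated" feature engineering: generate all pairwise interaction
# features mechanically, with no manual domain work.
engineered = make_pipeline(
    PolynomialFeatures(degree=2, interaction_only=True, include_bias=False),
    Ridge(),
)
enriched = cross_val_score(engineered, X, y, cv=5).mean()

print(f"R^2 with raw features:  {baseline:.3f}")
print(f"R^2 with interactions:  {enriched:.3f}")
```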

Speed & Scalability

Infosys Nia Advanced ML is written from the ground up for speed and scalability. We have benchmarked our algorithms against other software, both commercial and state-of-the-art open source, and have never lost in speed or in model accuracy, including in all customer proof-of-concept (POC) engagements.

This focus on speed allows users to extract maximum accuracy while using all of their data, a capability that has been worth many millions of dollars to some of our customers.

As a quantitative example, in our command-line system we have run gradient boosted decision trees on a 1-trillion-element training set (10 billion rows × 100 columns) on a regular Hadoop system, demonstrating near-ideal weak and strong scaling from a single node up to 100 nodes. In a second example, we ran a training set of approximately 500 million rows × 50 columns through the graphical user interface (GUI), combining that scale with all of our ease-of-use features.

Ease of Use

Infosys Nia Advanced ML is designed to give you the full power and business value of advanced machine learning while remaining easy to use. Through the GUI you can access the full automation and speed-and-scale capabilities of the system without having to write any code. For users who prefer to write their own code, a programmatic interface is available as an SDK, with bindings in a language such as Python. Running your code – for example, in a Jupyter notebook – causes the project to appear simultaneously in the GUI, allowing real-time interaction. For those who want closer integration with other tools, the underlying API is also available.
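To give a feel for that notebook-driven workflow, here is a purely hypothetical sketch. The class and method names below are invented stand-ins, not the actual Infosys Nia SDK; the stubs only mimic the create-a-project-then-model flow described above.

```python
# Purely hypothetical sketch: these names are invented for illustration
# and are NOT the real Infosys Nia SDK API. The stubs below just mimic
# the notebook workflow described in the text.
class NiaProject:
    """Stand-in for a project that would also appear in the GUI."""
    def __init__(self, name):
        self.name = name

    def auto_model(self, dataset_path, target):
        # In the real platform this would trigger an AutoModel search;
        # here we only echo what was requested.
        return f"AutoModel run on {dataset_path} predicting '{target}'"

class NiaClient:
    """Stand-in client; a real SDK call would talk to the platform server."""
    def __init__(self, url, api_key):
        self.url, self.api_key = url, api_key

    def create_project(self, name):
        print(f"Project '{name}' created - now visible in the GUI too")
        return NiaProject(name)

client = NiaClient(url="https://nia.example.com", api_key="...")
project = client.create_project("churn-model")
print(project.auto_model("customers.csv", target="churned"))
```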

The platform effect

Additionally, Infosys Nia Advanced ML can leverage the capabilities of other parts of the Infosys Nia platform – for example, Infosys Nia Data’s extensive set of connectors to data sources such as HDFS, databases, and social media. The platform is also extensible to include outside open source tools, further extending the reach of Infosys Nia Advanced ML.

Now, it might be said that other offerings in the market contain aspects of the capabilities of Infosys Nia Advanced ML discussed here. But what is highly unlikely to be found is the combination of all three qualities – automation, speed & scalability, and ease of use – in a single product. Additionally, the advanced features unique to Infosys Nia result in a uniquely capable offering, designed to appeal both to expert data scientists and to the much wider community of business users and analysts. Therein lies the significance of these capabilities coming together within an integrated platform.

Serving an ace with Infosys Nia™

“It is very interesting to take a machine, teach it tennis and have it start to talk to us in real-time and be able to provide our fans with that, as it is happening, in context, in real-time.”
– Murray Swartzberg, ATP SVP, IT and Digital Media.

This was the clear, overarching message laid out for the ATP-Infosys program, but delivering on it meant first making the machine learn tennis, and then having it generate the right content and insights at the right time, without being repetitive.

ATP has been collecting game data via the chair umpire console since 1991. So why reinvent stats and scores for a game that has changed little since the 1990s in terms of the data collected and the statistics generated? As it turns out, with the proliferation of data coming from external sources and connected stadiums, it is now possible to provide a much more engaging experience to tennis fans all over the world – one no longer limited to numbers or win percentages. The consumption models for tennis fans have changed dramatically since the 1990s and 2000s, with fans now watching the game on mobile devices and sharing their opinions on social media. The consumption model has evolved for players and their coaches too: players can now wear devices that track their service games and movement, and that provide a personalized coaching plan. There is a need to present contextual, real-time insights and tell the tennis story with visuals and videos.

Serving up more effectively – telling the story in new ways

The first step was to get the brain behind the machine to learn the game of tennis, so that the insights that followed were not mere regurgitations of ingested data.

ATP already had 26 years’ worth of data collected since 1991, plus enhanced statistics data from 2015. We started the journey by taking 5 years of ball-tracking data and 1 year of chair umpire data from the World Tour Finals. Five years of ATP World Tour Finals data from Hawk-Eye were also studied – for serve placement, winner ratios on forehands and backhands, spins and speeds on serves and shots, and more – and interesting insights were shared with players, coaches, and fans. Working with a treasure trove of such data, we were able to bring forth a range of new features for tennis fans and experts.

Infosys ATP Trends

Using the Infosys Nia platform, the top 3 pre-match insights – first-serve return percentage, double faults, and forehand placement – were shown at the ATP World Tour Finals 2015 in London. Parameters such as holding and breaking serve, and double faults at key moments, were studied, yielding point-by-point insights.

The graphic below was one of the first analyses done on ball spin for the Top 8 players. It showed that Nadal averaged 2,597 RPM per shot in 2013, compared against all the other Top 8 players.


Click here to view this analysis

At the Sydney International 2016, trends were re-imagined using only chair umpire data, and players from any tournament could be compared against the Top 8 on all data from January 2015 onward, updated up to the week before the tournament kicked off. We found, for example, that Troicki serves up an excellent ace-to-double-fault ratio – better than anyone among the Top 8 players!

ATP Leaderboards

In April 2016, ATP Leaderboards were created with a new ability to mine insights. Advanced statistics and indexes, similar to those in other sports like the NBA, NFL, and MLB, were created from 25 years of data going back to 1991, put into a consumable form to answer three simple questions: who is the best on serve, on return, and under pressure? The leaderboards also help fans drill down into the details to see individual statistics like 1st serve % won.
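As a toy illustration of how a leaderboard metric such as the ace-to-double-fault ratio might be computed from match-level records (the column names and numbers below are made up for the sketch, not real ATP data):

```python
import pandas as pd

# Made-up match-level records; real ATP data has far richer fields.
matches = pd.DataFrame({
    "player":        ["Troicki", "Troicki", "PlayerB", "PlayerB"],
    "aces":          [12, 9, 5, 7],
    "double_faults": [1, 2, 4, 3],
})

# Aggregate per player, then rank by aces-to-double-faults ratio.
board = matches.groupby("player")[["aces", "double_faults"]].sum()
board["ace_df_ratio"] = board["aces"] / board["double_faults"]
print(board.sort_values("ace_df_ratio", ascending=False))
```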


Click here to go to the ATP Leaderboards

Live Commentary

We had already deployed an analytics portal for bloggers and commentators at the Sydney International 2016, providing a comprehensive ability to slice and dice the data. By the end of the 2016 ATP season, we were able to go further, with a live commentary module built atop the Infosys Nia Data capability. It provided real-time commentary and insights during the Barclays ATP World Tour Finals 2016. Real-time data ingestion, statistics calculations, and the textual and graphical insights generated by Infosys Nia proved to be game-changing capabilities. The module was hosted live, and fans were excited to see it for the first time in the game of tennis with just a quick trip to the official website! The Live Commentary feature is visible across all the pages and the scores section. This was a big step in making the machine not only tell stories, but tell the most relevant story at a given point in the game. The machine was now able to automatically generate text commentary along with basic insights. This was further enriched for the ATP World Tour Rome Masters 2017, where we deployed an extended message scores piece that overlays the right graphics to supplement facts, providing a more engaging medium for fans to consume the information.
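Under the hood, template-driven generation is one simple way to turn live point data into text. The sketch below, with invented events and phrasing, illustrates the idea; the actual Infosys Nia module is far richer, choosing the most relevant story for the moment.

```python
# Illustrative only: map simple in-match events to commentary templates.
TEMPLATES = {
    "ace": "{player} fires ace number {count} of the match!",
    "break_point_saved": "{player} saves a break point - {count} saved so far.",
}

counts = {}

def commentate(player, event):
    """Emit a line of commentary for a recognized event, tracking counts."""
    key = (player, event)
    counts[key] = counts.get(key, 0) + 1
    template = TEMPLATES.get(event)
    if template:
        return template.format(player=player, count=counts[key])
    return None

for player, event in [("Murray", "ace"), ("Murray", "ace"),
                      ("Djokovic", "break_point_saved")]:
    line = commentate(player, event)
    if line:
        print(line)
```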


With all these new features enabled for tennis fans and experts, storytelling during and after a game had evolved on the ATP website – more visual, more engaging, and built around pictures that fans could easily understand and share on social media. New ways of looking at old statistics were yielding deep insights. It took real work to train the machine to sift through the statistics and surface the insights most relevant to a fan. Infosys Nia had served up an ace.

The Infosys Nia difference

With Infosys Nia, we had multiple options to acquire the data (both streaming and batch) and various options to store it. For most of the structured data, a number of relational stores are available out-of-the-box, including MySQL and Postgres. For unstructured documents, news, and social media data, there is provision to use the Hadoop file system, and for read-performance-sensitive scenarios, HBase and Cassandra are available. With this variety of data stores, there is no requirement for a fixed schema on incoming data, nor for additional processes aimed at transactional integrity for all data. Querying is made extremely fast by leveraging Spark SQL and/or Cassandra-based queries.
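For instance, once point-by-point data lands in one of these stores, an aggregate query through Spark SQL might look like the following sketch (assuming PySpark is available; the file path and schema are invented for illustration):

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("tennis-insights").getOrCreate()

# Hypothetical point-by-point records; the path and schema are invented.
points = spark.read.json("hdfs:///tennis/points.json")
points.createOrReplaceTempView("points")

# Fast aggregate query over a full season via Spark SQL.
serve_stats = spark.sql("""
    SELECT server,
           AVG(serve_speed_kmh) AS avg_serve_speed,
           SUM(CASE WHEN outcome = 'ace' THEN 1 ELSE 0 END) AS aces
    FROM points
    GROUP BY server
    ORDER BY aces DESC
""")
serve_stats.show()
```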

Key portions of Infosys Nia have been used to develop the technology behind the advanced statistics and insights. The technology, however, is not one-size-fits-all. The stack provides inexpensive, redundant storage with fault tolerance and high availability. The real value lay in mining the information once it was collected, cleaned, and pre-processed.

At the net

So what is the future of tennis scores and stats, and in what new ways can fans, bloggers, sponsors, and coaches use this information? The core Infosys Nia platform can be leveraged for multiple aspects, including acquiring data, analyzing it, and presenting it. Infosys Nia can work with advanced machine learning algorithms to predict the outcomes of various match scenarios. With a data platform and an adaptable engine at the core, it is now possible to engage the entire tennis ecosystem and drive synergies in harnessing the data for more effective storytelling.

These capabilities position ATP World Tour to keep evolving the art of telling the story of tennis as it unfolds. In the words of Murray Swartzberg, “Our sport is a story and the better you tell the story, the better it is for the sport and the fans.”

Banking on Things

IoT is the interconnection of uniquely identifiable embedded computing devices within the existing Internet infrastructure. IoT is expected to offer advanced connectivity of devices, systems, and services that goes beyond machine-to-machine (M2M) communications and covers a variety of protocols, domains, and applications. In the financial services space, the interconnection of these embedded devices is expected to usher in automation in several legacy processes.
As IoT-led digitization begins to take root, new business models and products are emerging. This is opening up new frontiers of innovation that can potentially reshape customer experiences, and throw up clear winners or losers in the financial services sector.
In my blog published earlier this year – The Beacon Beckons: Banks and the Internet of Things – I discussed how financial institutions can tap into IoT to automate, integrate, and build intelligence into each of their processes to create immense value at every stage. In this blog, I have listed use cases that may be adopted in banking over time spans ranging from near-term to long-term.

  1. Automated Payment through Things
    In payments, the integration of IoT and payment functionality will lead to a greater number of payment endpoints. Beyond the clichéd milk-ordering refrigerator, we are already starting to see the beginnings of connected devices and wearables being used for payment – for instance, through the Apple Watch or the Jawbone fitness band. When machines can transact with machines in real-time on a marginal-cost basis, the traditional concept of payments will become obsolete in many use cases, as transactions become automated and integrated into other services – virtually any “thing” could include an automated payment experience. Though IoT raises certain security concerns, personal biometrics and digital identities could actually increase payment security, if done right. Eventually, the opportunity extends not only to the end user, for whom automated payments will mean greater convenience and smarter transactions, but also to banks, payments companies, retailers, and technology manufacturers.
  2. Risk Mitigation in Trade Finance
    Tracking of high-value goods delivery using RFID is already a reality in the trade finance space. IoT will accelerate this to include fine-grained tracking of the asset – for instance, monitoring the temperature of the container for shipments involving temperature-sensitive goods such as pharmaceuticals and medicinal molecules. Alerts could be triggered if there is a chance of spoilage during the shipment process – say, one of the monitored parameters goes out of bounds. Such implementations can result in risk mitigation and more informed decision-making at banks in trade finance scenarios.
  3. Wallet of Things
    As an extension of automated payment through things, as more devices become digital and “smart”, it will be possible to have wallets associated with each device. For instance, an autonomous car could potentially pay for parking, gas, rental, or even maintenance service using its embedded wallet. Each and every home appliance or piece of consumer equipment could eventually host an embedded, pre-funded wallet capable of managing its running expenses on its own. From an owner’s perspective, a digital-identity-based “wallet of things” might provide an integrated view of the costs and expenses associated with owned or leased devices. (A toy sketch of such a device wallet appears after this list.)

IoT has the potential to reimagine banking as we know it completely. And it is more important than ever for banks to look at providing services and products on the channels that their customers prefer. In 2017 and beyond, we will see progressive banks take it a step further and provide “banking on things” – which can be anything from a smart car to smart walls.
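To make the “wallet of things” concrete, here is the promised toy sketch of a pre-funded device wallet with a simple spend rule. The class, limits, and merchants are all invented for illustration; a real embedded wallet would involve secure hardware, digital identity, and a payment network.

```python
class DeviceWallet:
    """Toy pre-funded wallet embedded in a device, with a per-payment
    spend limit as a stand-in for real authorization rules."""

    def __init__(self, device_id, balance, per_payment_limit):
        self.device_id = device_id
        self.balance = balance
        self.per_payment_limit = per_payment_limit
        self.history = []

    def pay(self, merchant, amount):
        # Enforce simple spend rules before moving any money.
        if amount > self.per_payment_limit:
            raise ValueError("Amount exceeds device spend limit")
        if amount > self.balance:
            raise ValueError("Insufficient wallet balance")
        self.balance -= amount
        self.history.append((merchant, amount))
        return f"{self.device_id} paid {amount:.2f} to {merchant}"

# An autonomous car paying for parking on its own.
car = DeviceWallet("car-42", balance=100.0, per_payment_limit=25.0)
print(car.pay("city-parking", 4.50))
print(f"Remaining balance: {car.balance:.2f}")
```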

Click here to read my article on ‘IoT in Banking – Enabling Banks’ Digital Future’ for the complete list of 12 use cases that may be adopted in banking in a time span ranging from near-term to long-term.

Future of Banking – The Platform takes over

Uber is arguably the most disruptive eight-year-old in history. It has also inspired a breed of precocious companies that have uberized everything from hospitality to professional services. In financial services, glimpses of the same can be seen in the form of marketplace lending and open banking initiatives. But total uberization? In my view, banking-as-a-platform in its evolved form is still some years away.
In fact, as of today, there isn’t even a common understanding of banking as a platform. The authors of “Pipelines, Platforms, and the New Rules of Strategy” describe a platform as providing “the infrastructure and rules for a marketplace that brings together producers and consumers”, with the players in the ecosystem filling four main roles while shifting rapidly from one role to another. The roles they define are: “Owners” of the platform, who take care of governance and IP; “Providers”, who make it possible for producers and users to interact via a common interface; “Producers”, who create the offerings; and “Consumers”, who use those offerings. Basically, the platform acts as the matchmaker that provides the right environment for providers, producers, and consumers to interact seamlessly.
Clearly, the platform concept has been around for quite some time – think of your neighborhood convenience store that gave the local community counter space to advertise their wares, or the classifieds of yore that provided a platform for buyers and sellers to connect. What has changed is that a number of powerful forces have come together to give platform businesses a reach, agility, and feasibility they could only have dreamt of in the pre-digital days.
The first of these is plain economics. In the current interest-rate environment, banks’ core source of revenue has shrunk, particularly in the developed markets of the United States and Europe. Highly competitive offerings from non-banking players have driven down banks’ income further. Hence the proposition of a digital platform model, which runs at a really low operating cost and distributes risk among several parties, is hugely appealing to providers.
Secondly, consumers are flocking not to a particular bank, but to a particular value. And thanks to technology, switching banks and opening new relationships have never been easier. Today, customers will go to the provider who offers them the best, most relevant service and experience. Unlike the brick-and-mortar bank, which could only be personally attentive to a few select customers, the platform bank, with access to data and insights, can personalize at scale.
Thirdly, a highly efficient platform bank can, at least in theory, reach any customer anywhere in the world over its digital channels. It can scale up at speed. And thanks to the ecosystem and open banking, it can also engage the customers of other banks in a servicing relationship, or by selling third-party products. If the example of other platform businesses has taught us anything, it is that eventually a couple of innovative players will rule the business. Every bank (and non-bank) wants to be the winning platform that takes all.
But it will take some doing to get there. To start with, banks have to replace their ownership mindset (“I will serve my customers with my products”) with the clear realization that the only thing that matters is giving customers what’s best for them – and if that means serving up a rival product, then so be it. They will also have to expose their APIs to allow developers to create the right offerings on their platform. Banks also need to shift their business focus from earning interest income to monetizing insights from data. For example, if the insight says that a customer should be offered a third-party product, then the bank must charge a fee from that party for generating the lead.
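As a hedged illustration of what exposing APIs could look like in practice, here is a minimal sketch of an open-banking-style endpoint using Flask. The route, fields, and data are invented; a real implementation would sit behind OAuth, customer consent checks, and regulatory controls.

```python
from flask import Flask, jsonify

app = Flask(__name__)

# Invented sample data standing in for a core-banking backend.
ACCOUNTS = {
    "acc-001": {"currency": "USD", "balance": 2450.75},
}

@app.route("/open-banking/v1/accounts/<account_id>/balance")
def get_balance(account_id):
    """Expose an account balance to a registered third-party app.
    A real endpoint would verify tokens and customer consent first."""
    account = ACCOUNTS.get(account_id)
    if account is None:
        return jsonify({"error": "account not found"}), 404
    return jsonify({"account_id": account_id, **account})

if __name__ == "__main__":
    app.run(port=5000)
```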
These are massive changes, yet they are feasible and perhaps even inevitable. A bank that makes the crossover safely will be able to differentiate itself from other banks purely on the strength of the quality of its experience.
That being said, not all banks are cut out to make this shift. A platform mindset requires a certain kind of leadership and culture, which most banks will struggle to provide. Some of them might choose to become product manufacturing specialists, and distribute their offerings through platform banks.
We believe that the “platformization” of banking has already begun. Examples like Deutsche Bank’s hub for SMEs, LendingClub, and WeChat are early reflections of the new model emerging in the industry. And will there ever be an Uber-size platform among banks? With the industry as regulated as it is, and with the rules around systemic risk for large banks, it looks rather difficult. Still, that is what the ambitious, progressive banks will aspire to. But Uber-sized or not, banks that succeed in their platform play will pull far ahead of their rivals. And those that don’t might be relegated to the role of a backend utility. If they survive, that is.

IoT in Banking – Enabling Banks’ Digital Future

IoT has the potential to impact traditional business processes in banking such as KYC, lending, collateral management, trade finance, payments, PFM, and insurance. Coupled with other emerging technologies, such as digital identity and smart contracts, IoT can create new P2P business models with the potential to disrupt banking in a few areas. Listed below are a few use cases that may be adopted in banking over time spans ranging from near-term to long-term.

  • Account Management on Things
    As more devices acquire digital interfaces, the term “mobile” or “digital” banking will acquire new meaning and customers will be able to access their bank accounts from practically any “thing” that has a digital interface – for instance, from entertainment systems in autonomous cars or planes.
    Banks will be aware of the context of the channel and can provide appropriately contextualized service or advice, enriching the interaction experience. Biometrics – voice or touch – can simplify account access in these new “anywhere” digital channels. Processes requiring physical signatures could use “Wet Ink” technology, i.e., the customer remotely signs through any touch-screen device and the signature is cloned onto physical paper in wet ink. This will eliminate barriers associated with in-person, paper-based transactions and enable clients to conduct business even when they cannot be physically present.
  • Leasing Finance Automation
    Real-time monitoring of the wear and tear of assets, as well as metrics like asset usage and idle time, could provide important data points for the pricing of leased assets. This could lead to the introduction of a new daily leasing model for a wide variety of digitally enabled assets – effectively turning even traditional products into services. Terms of leasing could be simplified and automated, as the bank wields greater control over the leased asset. For instance, in case of contract termination or default, the leased asset could be locked or disabled remotely by the bank.
  • Smart Collaterals
    IoT technology can enable banks to have better control over a customer’s mortgaged assets, such as cars, and also to monitor their health. In such a scenario, a retail or SME customer could raise short-term, small-value finance by offering manufacturing machinery, cars, or expensive home appliances as collateral. The request for financing, as well as the transfer of ownership, could be automatic and completely digital. Enabled by digital identity for people as well as things, the transfer of ownership of an asset can be achieved in a matter of seconds. The bank can then issue the loan immediately and monitor the collateral status in real-time, without needing to take physical custody of the asset. The bank can remotely disable or enable the machine or motor at any time, based on defined business rules – for instance, if loan EMIs are not paid, the engine could be disabled. The quality of the collateral can also be monitored in near real-time. (A toy sketch of such a business rule follows.)
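Here is that toy sketch of the business-rule side of such a scheme. The loan record, grace period, and the remote-disable hook are all invented for illustration; a real system would pull loan state from core banking and send commands over a secure device channel.

```python
from datetime import date

# Invented loan record; a real system would fetch this from core banking.
loan = {
    "asset_id": "machine-7",
    "emi_due_date": date(2017, 6, 1),
    "emi_paid": False,
    "grace_days": 15,
}

def evaluate_collateral(loan, today):
    """Apply a simple rule: if the EMI is unpaid past the grace period,
    signal that the collateralized asset should be remotely disabled."""
    overdue = (today - loan["emi_due_date"]).days
    if not loan["emi_paid"] and overdue > loan["grace_days"]:
        return "disable"
    return "ok"

action = evaluate_collateral(loan, date(2017, 6, 20))
if action == "disable":
    print(f"Sending remote-disable command to {loan['asset_id']}")
```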
IoT has the potential to reimagine banking as we know it completely. And it is more important than ever for banks to look at providing services and products on the channels that their customers prefer. In 2017 and beyond, we will see progressive banks take it a step further and provide “banking on things” – which can be anything from a smart car to smart walls.
