
How Alternative Data Can Transform Your Business Intelligence

EDITOR’S NOTE: This article is about harnessing new sources of Alternative Data. True Interaction built SYNAPTIK, our Data Management, Analytics, and Machine Learning Platform, specifically to make it easy to collect and manage core and alternative data/media types for more meaningful data discovery. For more information or a demo, please visit us at https://synaptik.co/ or email us at hello@www.true.design

Big data has been commonly described over the last few years through properties known as the “3 V’s”: Volume, Velocity, and Variety. If you are a human being just about anywhere in the world today, it’s patently obvious to you that these three dimensions are increasing at an exponential rate.

We’ve seen the staggering statistics with regards to Volume and Velocity reported and discussed everywhere:

Big Volume
IDC reported that the data we collectively create and copy globally is doubling in size every two years. Calculated at 4.4 zettabytes in 2014, the organization estimates global data will reach 44 zettabytes — that’s 44 trillion gigabytes — by 2020.
Cisco forecasts that overall mobile data traffic is expected to grow to 49 exabytes per month by 2021, a seven-fold increase over 2016. Mobile data traffic will grow at a compound annual growth rate (CAGR) of 47 percent from 2016 to 2021.

Big Velocity

Facebook’s 1.97 billion monthly active users send an average of 31.25 million messages and view 2.77 million videos every minute.

Twitter’s 308 million monthly active users send, on average, around 6,000 tweets every second. This corresponds to over 350,000 tweets sent per minute, 500 million tweets per day and around 200 billion tweets per year.

Big Variety = Alternative, Non-traditional, Orthogonal Data

These well-touted figures often leave one feeling aghast, small, and perhaps powerless. Don't worry, you're not alone! So today, let's get ourselves right-sized again and shift our focus to the third dimension of big data, examining a growing, more optimistic, and actionable business trend that is materializing in organizations and businesses of all kinds, across just about any industry you can imagine, without regard for business size or scope. Let's examine the explosion of big data Variety, specifically with regard to harnessing new and emerging varieties of data to further inform reporting, forecasting, and the provision of actionable BI insights.

In a pattern similar to online retail's "Long Tail" (the shift toward selling niche products to consumers that emerged in the 2000s), more and more future-leaning businesses are incorporating outside, alternative "niches" of data that differ from the traditional data sources that standard BI dashboards have commonly provided.

In a recent interview in CIO, Krishna Nathan, CIO of S&P Global explained that “Some companies are starting to collect data known as alternative, non-traditional or orthogonal.” Nathan further describes Alternative Data as the various data “that draw from non-traditional data sources, so that when you apply analytics to the data, they yield additional insights that complement the information you receive from traditional sources.” Because of the increasing prevalence of data from mobile devices, satellites, IoT sensors and applications, huge quantities of structured, semi-structured and unstructured data have the potential to be mined for information and potentially help people make better data-driven decisions. “While it is still early days for this new kind of data”, Nathan says, “CIOs should start to become familiar with the technologies now. Soon enough, alternative data will be table stakes.”

In The Field

Let’s examine the various applications of these new data sources that are manifesting themselves in parallel with the burgeoning technological advancements in our world today.

VC and Credit

Alternative data is increasingly wielded by VC firms as well as the credit industry to lend insight into backing startups, businesses, and technologies. Many small businesses, especially those with a limited credit history, have difficulty demonstrating creditworthiness and may be deemed high risk when viewed through the lens of traditional data sources.

However, Experian recently described the growing number of online marketplace lenders, or nonbank lenders, that have already begun taking a nontraditional approach by leveraging a wealth of alternative data sources, such as social media, Web traffic, or app downloads, to help fill the void left by a business's limited credit history. By combining traditional and nontraditional data sets, these lenders are able to help small businesses access financial resources while expanding their own portfolios.

Health

Patient information continues to be collected through traditional public health data sources, including hospital administration departments, health surveys and clinical trials. Data analysis of these sources is slow, costly, limited by responder bias, and fragmented.

However, according to MaRS DD, a research and science-based entrepreneurial venture firm, with the growing use of personal health applications among the public, self-reported information on prescription drug consumption and nutritional intake can be analyzed and leveraged to gain insight into patient compliance and use patterns, as well as into chronic disease management between visits to frontline healthcare practitioners. In addition, social media platforms can serve as both monitoring tools and non-traditional methods of increasing patient engagement, as they allow healthcare professionals to interact with populations that under-utilize services. Healthcare organizations can mine social media for specific keywords to focus initiatives that track the spread of influenza, Zika, or opioid addiction, for example, or even to provide real-time intervention.

Retail, Dining, Hospitality and Events

Several different kinds of data sources can give these industries a bigger picture and aid in both more granular reporting and more accurate forecasting. For example, Foursquare famously predicted, based on check-ins in its application, that Chipotle same-store sales would fall 29 percent after the Mexican chain was hit with E. coli outbreaks. The actual decline announced by Chipotle ended up being a spot-on 30 percent. It's no coincidence that Foursquare recently announced Foursquare Analytics, a foot traffic dashboard for brands and retailers.

In addition, by making use of CCTV or drone imagery, incredible insight can be garnered from examining in-store foot traffic or the density of vehicles in a retailer's parking lot over time. Today, a combination of Wi-Fi hotspots and CCTV cameras can compile numbers about in-store customer traffic patterns in the same way that online stores collect visitor and click information. For example, by using a modern CCTV system to count the number of people in each part of the store, heatmap analytics can visualize "hot zones" to help maximize in-store promotional campaigns and identify "cold zones" to determine how store layout changes can improve customer traffic flow.
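As a rough illustration of the heatmap idea, the sketch below classifies store zones as hot or cold from per-zone visitor counts. The zone names and figures are hypothetical, standing in for whatever a CCTV people-counting system would actually emit.

```python
from collections import Counter

# Hypothetical hourly person-counts per store zone, as a CCTV
# people-counting system might emit them (zone names are illustrative).
observations = [
    ("entrance", 120), ("apparel", 85), ("electronics", 40),
    ("entrance", 140), ("apparel", 95), ("electronics", 35),
    ("clearance", 12), ("clearance", 9),
]

# Aggregate total traffic per zone.
totals = Counter()
for zone, count in observations:
    totals[zone] += count

# Classify zones relative to the store-wide average.
avg = sum(totals.values()) / len(totals)
hot_zones = sorted(z for z, c in totals.items() if c > avg)
cold_zones = sorted(z for z, c in totals.items() if c <= avg)
```

A real deployment would bin counts by time of day as well, since a zone that is cold in the morning may be hot in the evening.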

Don’t forget the weather! By leveraging a real-time weather data analytics system in order to process historical, current, and forecasted weather data, retailers can predict how shifting demands will affect inventory, merchandising, marketing, staffing, logistics, and more.
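To make the weather idea concrete, here is a minimal sketch that measures how strongly temperature tracks demand for one item. All figures are invented, and a real system would use far richer weather features than temperature alone.

```python
import math

# Hypothetical daily observations: mean temperature (°F) and units sold
# of a cold-weather item (all figures illustrative).
temps = [34, 41, 55, 63, 72, 80]
sales = [210, 190, 140, 120, 90, 60]

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Strongly negative: warmer days, fewer units sold, so a warm forecast
# argues for trimming inventory of this item.
r = pearson(temps, sales)
```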

Wall Street

You can bet your life that investment firms are early adopters of alternative data sources such as those in the Chipotle-Foursquare story mentioned earlier. Consider the incredible resource that satellite imagery is becoming; it's not just for government intelligence anymore. Because satellite imagery now enables organizations to count cars in retailers' parking lots, it is possible to estimate quarterly earnings ahead of a business's quarterly reports. Data analysts can use simple trigonometry to measure the shadows cast by floating oil tank lids in order to gauge the world's oil supply. By monitoring vehicles coming and going from industrial facilities in China, it's even possible to create a nascent China manufacturing index. Infrared sensors combined with satellite images can detect crop health far ahead of the USDA. All of this is a boon for traders and investors.
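The oil-tank shadow technique can be sketched in a few lines of trigonometry. The function below is a simplified illustration, assuming a floating-roof tank of known height and a shadow length and sun elevation angle already extracted from the image; real pipelines must also correct for viewing geometry and tank shape.

```python
import math

def fill_fraction(tank_height_m, shadow_len_m, sun_elev_deg):
    """Estimate how full a floating-roof tank is from satellite imagery.

    The tank wall casts a shadow onto the floating lid; the emptier the
    tank, the deeper the lid sits and the longer that interior shadow.
    Given the sun's elevation angle, basic trigonometry converts shadow
    length into the lid's depth below the rim.
    """
    lid_depth = shadow_len_m * math.tan(math.radians(sun_elev_deg))
    lid_depth = min(lid_depth, tank_height_m)  # clamp to physical range
    return 1.0 - lid_depth / tank_height_m

# A 20 m tank whose interior shadow is 10 m long under a 45-degree sun:
# the lid sits 10 m below the rim, so the tank is roughly half full.
frac = fill_fraction(20.0, 10.0, 45.0)
```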

What About Your Organization?

No matter the size of your business, now is the time to consider upgrading your 2000s-era BI dashboards to incorporate alternative data sources; remember, the convergence of IoT, cloud, and big data is creating new opportunities for analytics all the time. Data is expected to double every two years for the next decade. Furthermore, it is essential to integrate all of these new data opportunities with traditional data sources in order to create a full spectrum of analytics and drive more intelligent, more actionable insights.

The Right Analytics Platform

Legacy data management systems that have not optimized their operations will not be able to process these new and disparate sources of alternative data to produce relevant information in a timely manner. The lack of machine learning mechanisms within these sub-optimal systems will hinder businesses in their knowledge discovery process, barring organizations from making data-driven decisions in real time.

According to Joe Sticca, Senior Executive of Digital Transformation & Data Science for True Interaction, “The most deleterious disadvantage of failing to address these pressing issues… is the careless neglect of invaluable business insight that is concealed in the mass of available data. Now, more than ever, businesses of all size need the ability to do great data discovery, but without necessitating a deep core technical development and data analyst skillset to do so.”

One solution path? A cutting-edge, fully managed data and machine learning platform like Synaptik, which makes it easy to connect with dozens of structured and unstructured data services and sources in order to harness algorithms, statistical analysis, predictive modeling, and machine learning for a multitude of purposes and metrics, such as brand sentiment, campaign effectiveness, and customer experience. Synaptik helps businesses transform via an adaptive, intuitive, and accessible platform built on a modern mix of lightweight frameworks, scalable cloud services, and effective data management and research tools. More importantly, it works with non-IT skill sets to propagate better pattern recognition across your organization's people and divisions.

(infographic by Quandl.com)

by Michael Davison


As Online Video Matures, New Data Challenges Emerge

As 2017 drives on, we've seen the continued evolution of digital media, mostly surrounding video, especially with regards to live streaming and mobile. It's paramount for any organization, regardless of size, to be aware of these trends in order to best take action and capitalize on them.

Mobile, OTT, Live

More and more video is being produced for and consumed on mobile. The weekly share of time spent watching TV and video on mobile devices has grown by 85% since 2010. Mobile will account for 72% of US digital ad spend by 2019. Traditional plugged-in cable TV continues to decline, as audiences demand to consume their media, wherever and whenever they want.

Over-the-top (OTT) content is audio, video, and other media content delivered over the Internet without the involvement of a multiple-system operator (MSO) in the control or distribution of the content; think Netflix and Hulu versus your traditional HBO cable subscription. OTT now reaches an increasingly important segment of the video-viewing population, and the rising popularity of multiple OTT services beyond Netflix suggests that the market is poised for more growth. According to comScore, 53% of Wi-Fi households in the U.S. are now using at least one over-the-top streaming service, with Netflix being the primary choice.

Meanwhile, the Live streaming market continues to explode, expected to grow to $70.05B by 2021, from $30.29B in 2016. Breaking news makes up 56% of most-watched live content, with conferences and speakers tied with concerts and festivals in second place at 43%.

The usual giants are leading the charge with regards to propagating and capitalizing on live streaming; in June of 2016, it was reported that Facebook had paid 140 media companies a combined $50m to create videos for Facebook Live. Industry influencers predict that we will see major brands partner with live broadcasting talent to personalize their stories, as well as innovate regarding monetization with regards to live video. We might even see the resurgence of live advertising, according to food blogger Leslie Nance in a report by Livestream Universe. “I think we will be seeing more of live commercial spots in broadcast. Think Lucy, Vita Vita Vegimen. We will come full circle in the way we are exposed to brands and their messages.”

However, one of the greatest advantages of live streaming is its simplicity and affordability – even small business owners can – and should – leverage its benefit. Says Content Strategist Rebekah Radice,

“Live streaming has created a monumental shift in how we communicate. It took conversations from static to live, one-dimensional to multi-faceted. I predict two things. Companies that (1) haven’t established relationships with their social media audience (invested in their community – optimized the experience) and (2) don’t extend that conversation through live streaming video (created an interactive and open communication channel) will lose massive momentum in 2017.”

The Social Media Connection

Social Media is used especially in concert with live video. Because live streaming propagates a feeling of connectedness – very similar to the eruptions of activity on Twitter during unfolding current events – live streaming also inspires more simultaneous activity, especially with regards to communication. Consumers conduct more email, texting, social media use and search while streaming live video than on-demand or traditional TV viewing.
At the beginning of 2016, Nielsen introduced Social Content Ratings, the most comprehensive measure of program-related social media activity across both Facebook and Twitter, to capture this trend. “With social media playing an increasing role in consumers’ lives and TV experiences, its value for the media industry continues to mature,” said Sean Casey, President, Nielsen Social, in a company press release.
By measuring program-related conversation across social networking services, TV networks and streaming content providers can better determine the efficacy of social audience engagement strategies, as well as bring more clarity to the relationship between social activity and user behaviors while watching.

Nielsen says that the ratings system will support agencies and advertisers in making data-driven media planning and buying decisions as they seek to maximize social buzz generated through ad placements, sponsorships, and integrations.

Deeper Analytics, More Challenges

Besides Nielsen’s new Social Content Ratings, major tech platforms like Google and Facebook are already rolling out new analytics features that allow users to filter audiences by demographics such as age, region, and gender. In the near future, these analytics will become even more complex. Certainly, more sophisticated forms of measuring user engagement will enable advertisers to learn more about how users respond to messaging, and to build campaigns more cost-efficiently, provided they have the ability to see, compare, and take action on their disparate data. One of the main challenges facing the market today is the effective integration of digital data with traditional data sources to create new and relevant insights.

There is a deluge of data that is generated through non-traditional channels for media and broadcasting industry such as online and social media. Given the volumes, it is impossible to process this data unless advanced analytics are employed. ~Inteliment

The Proper Data Solution

As we become more accustomed to this “live 24/7” paradigm, the onus is on organizations to ensure that they are properly deriving actionable data from this ever-expanding myriad of sources, so that they may better:

-Assess audience participation and engagement
-Measure the efficacy of media content
-Predict and determine user behaviors
-Plan advertising budget

According to Joe Sticca, Senior Executive of Digital Transformation & Data Science for True Interaction, “…the most deleterious disadvantage of failing to address these pressing issues… is the careless neglect of invaluable business insight that is concealed in the mass of available data.”

Thus, data management systems that have not optimized their operations will not be able to process data to produce relevant information in a timely manner. The lack of machine learning mechanisms within these sub-optimal systems will hinder businesses in their knowledge discovery process, barring organizations from making data-driven decisions in real time. Mr. Sticca concludes that “…now, more than ever, businesses of all size need the ability to do great data discovery, but without necessitating a deep core technical development and data analyst skillset to do so.”

One solution path? A cutting-edge, fully managed data and machine learning platform like Synaptik, which makes it easy to connect with dozens of structured and unstructured data services and sources in order to harness algorithms, statistical analysis, predictive modeling, and machine learning for a multitude of purposes and metrics, such as brand sentiment, campaign effectiveness, and customer experience. Synaptik helps businesses transform via an adaptive, intuitive, and accessible platform built on a modern mix of lightweight frameworks, scalable cloud services, and effective data management and research tools. More importantly, it works with non-IT skill sets to propagate better pattern recognition across your organization's people and divisions.

By Michael Davison


The Technology Solution to the IoT Conundrum

Unhindered in its incessant growth, the Internet of Things (IoT) continues to increase its network of connected devices exponentially. Gartner predicts that a staggering 20 billion connected devices will be in existence by 2020. To put this into further context, the current trajectory of the IoT will soon usher in an age where there are, on average, three connected devices for every living person. In keeping with Gartner's research, this fast-growing industry will soon be powering a market worth upwards of $3 trillion. Explosive growth in any industry is always accompanied by a barrage of data, a challenge that we here at True Interaction understand well. The issues associated with easily capturing vast amounts of data, both structured and unstructured, are a critical barrier in the pursuit of data discovery, and we provide solutions to these data management difficulties through our platform, Synaptik.

While the unimpeded growth of the IoT is indeed quite promising, we cannot simply dismiss the unique set of challenges that accompany such a sudden and chaotic influx of devices. The current infrastructure of the Internet and online services is hardly primed to handle the identifying, connecting, securing, and managing of so many devices. The difficulties posed by large IoT ecosystems may be considered a problem for the future, but the technology that can address these issues potentially exists now.

Blockchain as a Solution?

Hailed for the transparency, accuracy, and permanence inherent in its process, blockchain, as our previous post explains, "create[s] a digital ledger of transactions that can be shared among a distributed network of computers." The use of cryptography allows each account on the network to access and manipulate the ledger securely, decentralizing the process and essentially eliminating the need for a middleman. Finance saw one of the first and most notably successful implementations of blockchain in Bitcoin, and the industry is seemingly eager to embrace the technology even more. We have covered the meteoric rise in importance and interest that blockchain technology has been attracting in a comprehensive post, as well as its applications in the media and entertainment industry. This instance, however, with its suggested applications in the IoT, is especially momentous in that it marks the convergence of two recently developed technological sectors. So how will the blockchain model assist the IoT industry in its promising ascent?
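To make the shared-ledger idea concrete, here is a minimal sketch of how blocks chain together via hashes. It is a toy illustration only, not how Bitcoin or any production blockchain works (there is no consensus, mining, or signature logic), but it shows why tampering with history is detectable: each block's hash commits to the previous block's hash.

```python
import hashlib
import json
import time

def make_block(transactions, prev_hash):
    """Create a block whose hash commits to its contents and predecessor."""
    block = {
        "timestamp": time.time(),
        "transactions": transactions,
        "prev_hash": prev_hash,
    }
    payload = json.dumps(block, sort_keys=True).encode()
    block["hash"] = hashlib.sha256(payload).hexdigest()
    return block

# A tiny two-block chain: altering the first block would change its hash
# and thereby invalidate every block that follows it.
genesis = make_block([{"from": "a", "to": "b", "amount": 5}], prev_hash="0" * 64)
block2 = make_block([{"from": "b", "to": "c", "amount": 2}], prev_hash=genesis["hash"])
```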

Security

The IoT's impressive growth is both an assuring testament to its bright future and its greatest vulnerability. A centralized model of security has been effective in the past, but it is hardly equipped to handle networks whose nodes balloon into the millions and whose devices conduct billions of transactions. Not only will the computational requirements (and costs!) skyrocket, but expanding a network to that size will inevitably cause servers to become a bottleneck. This chaotic and highly vulnerable state leaves servers susceptible to Denial of Service (DoS/DDoS) attacks, in which servers are targeted and brought down by floods of traffic from compromised devices.

The organization inherent in the system gives blockchain the ability to create secure mesh networks that allow devices to connect reliably to one another. Once the legitimacy of a node has been secured and registered on the blockchain, devices will be able to identify and authenticate each other without the need for central brokers. An added benefit of this model is its scalability and the way it can be expanded to support a billion devices while barring the need for additional resources. IBM extrapolates the results of blockchain’s impeccably accurate and transparent record-keeping capabilities by anticipating more trust to form between people and parties, thus making transactions run more seamlessly. Chris O’Connor, General Manager, Internet of Things Offerings for IBM, illustrates the concept:

 

“While Person A may not know device B and may not trust it implicitly, the indelible record of transactions and data from devices stored on the blockchain provide proof and command the necessary trust for businesses and people to cooperate.”

 

Self-Regulation

What is a common feature in the most expansive imaginings of a technologically unmatched world? Typically, the height of technological success is marked by the existence of self-sustaining machines. It may astonish people to learn that the means for creating devices that have little to no need for human interference already exists. IBM and Samsung have partnered together in developing a concept known as ADEPT (Autonomous Decentralized Peer-to-Peer Telemetry). The project chose three protocols (file sharing, smart contracts, and peer-to-peer messaging) to underpin the seminal concept.

One of the most interesting proposals for the use of this technology is enabling devices to autonomously maintain themselves. IBM's draft paper features lofty goals that include devices not only being capable of signaling operational problems, but also being able to retrieve software updates and potentially address their self-diagnosed issues. The ADEPT technology is intended to accomplish the incredible feat of allowing devices to communicate with other nearby devices in order to facilitate power bartering and energy efficiency. Machines that consume supplies will be able to restock themselves as well. This feature will debut in a Samsung W9000 washing machine: wielding the ADEPT system, the machine will use smart contracts to place a detergent order, pay for it itself, and later receive word from the retailer that the detergent has been paid for and shipped.

Smart Contracts

In the digital age, with the emergence of a slew of transaction systems, blockchain is being heralded as the next logical step. At the heart of blockchain technology is its unique penchant for transparency and demand for accountability from its users. Moreover, its decentralized process negates the need for intermediaries. These unique features make blockchain a feasible platform on which to conduct smart contracts. Co-opting the technology's intended transactional use, "contracts could be converted to computer code, stored and replicated on the system and supervised by the network of computers that run the blockchain." The exchange of money, property, or anything of value in a conflict-free manner, without a middleman to broker the deal, now exists thanks to blockchain technology.
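As a toy illustration of "contract as code," the sketch below encodes a simple escrow whose funds release automatically once the delivery condition is met. The class and names are invented for illustration; real smart contracts are written for a blockchain platform (e.g. in Solidity on Ethereum) rather than plain Python.

```python
class EscrowContract:
    """Toy sketch of a smart contract: contractual terms encoded as code.

    This plain-Python illustration only mimics the core idea of
    conditional, intermediary-free settlement.
    """

    def __init__(self, buyer, seller, price):
        self.buyer, self.seller, self.price = buyer, seller, price
        self.deposited = 0
        self.delivered = False
        self.settled = False

    def deposit(self, amount):
        """Buyer escrows funds; no bank holds them on anyone's behalf."""
        self.deposited += amount
        self._maybe_settle()

    def confirm_delivery(self):
        """Delivery confirmation is the condition that releases payment."""
        self.delivered = True
        self._maybe_settle()

    def _maybe_settle(self):
        # Settlement fires automatically the moment both conditions hold,
        # with no middleman signing off on the transfer.
        if self.delivered and self.deposited >= self.price:
            self.settled = True

contract = EscrowContract("alice", "bob", price=100)
contract.deposit(100)
contract.confirm_delivery()  # settlement now triggers on its own
```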

Blockchain is just one of many frameworks and sources of data into which Synaptik can be easily integrated. Data management is the most critical piece in the seamless execution of a successful data discovery process that is capable of gleaning answers to questions you may not have known to ask.

For more information to empower your data science initiatives, please visit us at www.Synaptik.co. We pride ourselves on our ability to empower everyday users to do great data discovery without the need for deep core technical development skills.

Denisse Perez, Content Marketing Analyst for True Interaction, contributed to this post.

by Joe Sticca


Evolution of Big Data Technologies in the Financial Services Industry

Our previous post provides an industry analysis that examines the maturity of banking and financial markets organizations. The significant deviations from the traditional business model within the financial services industry in recent years emphasize the increasing need for a difference in how institutions approach big data. The long-standing industry, so firmly entrenched in its decades-long practices, is seemingly dipping its toes into the proverbial pool of big data as organizations recognize that its implementation is integral to a firm's survival, and ultimately its growth. IBM's Big Data @ Work survey reports that 26 percent of banking and financial markets companies are focused on understanding the concepts surrounding big data. On the other end of the spectrum, 27 percent are launching big data pilots, while the largest share of the companies surveyed in this global study (47 percent) remains in the planning stage, defining a road map toward the efficient implementation of big data. For those organizations still planning and refining, it is crucial to understand and integrate the observed trends in financial technology that can bolster a company's big data strategy.

Customer Intelligence

While banks have historically maintained a monopoly on their customers' financial transactions, the current state of the industry, with competitors flooding the market on different platforms, prevents this practice from continuing. Banks are being transformed from product-centric to customer-centric organizations. Of the survey respondents with big data efforts in place, 55 percent report customer-centric objectives as one of their organization's top priorities, if not their utmost aim. In order to engage in more customer-centric activities, financial service companies need to enhance their ability to anticipate changing market conditions and customer preferences. This will in turn inform the development and tailoring of their products and services toward the consumer, allowing them to swiftly seize market opportunities as well as improve customer service and loyalty.

Machine Learning

Financial market firms are increasingly aware of the many potential applications for machine learning and deep learning, two of the most prominent being in the fraud and risk sectors of the industry. The sheer volume of consumer information collected from the innumerable transactions conducted daily, across a plethora of different platforms, calls for stronger protocols around fraud and risk management. Many financial services companies are just beginning to realize the advantages of including machine learning within an organization's big data strategy. One such company is PayPal, which, through a combination of linear, neural network, and deep learning techniques, is able to optimize its risk management engines to identify the level of risk associated with a customer in mere milliseconds. The potential foreshadowed by these current applications is seemingly endless, optimistically suggesting that machine learning algorithms could replace statistical risk management models and become an industry standard. The overall value that financial institutions can glean from machine learning techniques is access to actionable intelligence based on previously obscured insights. The integration of machine learning tactics will be a welcome catalyst in the acceleration toward more real-time analysis and alerting.
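As a rough illustration of how such a risk engine scores a transaction, the sketch below applies a logistic function to a few hand-picked features. The features, weights, and thresholds are all invented for illustration; a production system of the kind described would learn its parameters from millions of labeled transactions rather than use hand-set values.

```python
import math

def risk_score(amount_usd, txns_last_hour, new_device, weights, bias):
    """Logistic risk score in [0, 1] for a single transaction.

    Larger amounts, bursts of activity, and unrecognized devices all
    push the score toward 1 (higher fraud risk).
    """
    z = (weights[0] * math.log1p(amount_usd)       # dampened amount effect
         + weights[1] * txns_last_hour             # velocity of activity
         + weights[2] * (1 if new_device else 0)   # unfamiliar device flag
         + bias)
    return 1 / (1 + math.exp(-z))

WEIGHTS, BIAS = (0.6, 0.4, 1.2), -6.0  # illustrative hand-set parameters

low = risk_score(25.0, 1, False, WEIGHTS, BIAS)    # routine small purchase
high = risk_score(900.0, 8, True, WEIGHTS, BIAS)   # burst on a new device
```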

IoT

When attempting to chart the future of financial technology, many point to the Internet of Things (IoT) as the next logical step. Often succinctly described as machine-to-machine communication, the IoT is hardly a novel concept, with the continual exchange of data already occurring between "smart" devices without human interference. As some industries, such as retail and manufacturing, already utilize this technology to some extent, it is not far-fetched to posit that the financial services industry will soon follow suit. While there are those who adamantly reject the idea because the industry is in the business of providing services as opposed to things, this would be a dangerously myopic view in this day and age. Anything from ATMs to information kiosks could be equipped with sensing technology to monitor and take action on the consumers' behalf. Information collected from real-time, multi-channel activities can aid in informing how banks provide the best, most timely offers and advice to their customers.

For more information to empower your data science initiatives, please visit us at www.Synaptik.co. We pride ourselves on empowering everyday users to do great data discovery without the need for deep core technical development skills.

Joe Sticca, Chief Operating Officer of True Interaction, contributed to this post.

By Justin Barbaro


Technological Disruptions in the Financial Services Industry

The financial services industry has long boasted a resilient and steadfast business model and has proven to be one of the sectors most resistant to disruption by technology. In recent years, however, a palpable change is overturning the ways and processes that have upheld this institution for so long. Organizations within this historically traditional and unyielding sector are realizing the need not only to assimilate into the digital era, but to embrace and incorporate it fully, or else be overtaken by others who have opted to innovate rather than succumb under the pressure of this increasingly competitive industry.

Changing Dynamics

The relationship between banks and their customers has drastically changed from the time the traditional banking model was formulated. Perhaps more than any other commercial enterprise, banks retained control over the relationships they had with their customers. For the most part, the bank with which an individual was aligned often determined his or her financial identity, as nearly all transactions were administered through one's bank. Moreover, McKinsey reports that, historically, consumers very rarely flitted between different service providers because of the promising image of stability that the industry has worked hard to maintain. While this may have been the case in the past, the relational dynamics between banks and their customers are not the same today as they were a decade or so ago.

Instead of relying on a single bank for all financial dealings, consumers have more options at their disposal, which they are taking full advantage of by engaging in transient relationships with multiple banks, such as “a current account at one that charges no fees, a savings account with a bank that offers high interest, a mortgage with one offering the best rate, and a brokerage account at a discount brokerage.” Competition between financial institutions is undeniably fiercer than ever, and consumers are also being courted by new peer-to-peer services, such as PayPal, that allow them to conduct financial transactions beyond traditional banking means and organizations.

Data Growth

The sheer rise in players and competitors within this industry alone is enough to indicate another glaring issue: the sudden growth in the volume of financial transactions. More transactions lead to an explosion of data growth for financial service providers, a predicament that few organizations are adequately prepared to handle. A study conducted by Capgemini and RBS estimates the global volume of electronic payments at about 260 billion, growing between 15 and 22 percent in developing countries. The expansion of data points stored for each transaction, committed on the plethora of devices available to consumers, is complicating the active defense against fraud and the detection of potential security breaches. Oracle observes a shift in the way fraud analysis is conducted: previously performed over a small sample of transactions, it now necessitates the analysis of entire transaction history data sets.
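As a rough illustration of analyzing an entire transaction history rather than a small sample, a minimal anomaly check might look like the sketch below. The data, function name, and threshold are invented for illustration; this is not Oracle's method, just the underlying idea of modeling each account's full history.

```python
# Hypothetical sketch: flag a transaction whose amount deviates sharply
# from the account's FULL historical distribution, rather than sampling.
from statistics import mean, stdev

def flag_anomaly(history, new_amount, z_threshold=3.0):
    """Return True if new_amount is a statistical outlier for this account."""
    if len(history) < 2:
        return False  # not enough history to model the account
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return new_amount != mu
    return abs(new_amount - mu) / sigma > z_threshold

# Full transaction history for one account (amounts in dollars)
history = [42.0, 55.0, 38.0, 61.0, 47.0, 52.0, 44.0, 58.0]

print(flag_anomaly(history, 49.0))    # typical amount -> False
print(flag_anomaly(history, 5000.0))  # far outside the history -> True
```

Real fraud systems combine many such signals (merchant, geography, velocity), but the shift Oracle describes is precisely this move from sampled to whole-history scoring.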

Timely Insight

Shrinking revenue is one of the most prominent challenges facing financial institutions today, calling for improved operational cost efficiencies. New financial technologies are being developed to address such issues by leveraging the vast data these institutions have access to and monetizing it. Traditional Business Intelligence tools have been a staple in the industry for years, but they are usually limited in capacity. Many of the available BI tools work well in conjunction with business analysts looking for answers to a specific conundrum. The key to revamping traditional banking frameworks to make them more competitive and agile in the current environment is to build and incorporate processes capable of revealing patterns, trends, and correlations in the data. Oracle posits that disruptive technologies in the financial sector need to do more than report; they must also uncover. The technological ramifications of an evolving financial industry, with a continuously expanding amount of data and the demand for real-time, data-driven decisions, include the ability to detect unanticipated questions and simultaneously provide tangible solutions.
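The difference between reporting and uncovering can be sketched with a toy correlation scan: instead of answering a question an analyst posed, the system surfaces relationships nobody thought to ask about. The metrics, figures, and threshold below are invented, not any vendor's algorithm.

```python
# Illustrative "uncovering" sketch: scan pairwise correlations across
# operational metrics and surface only the strong relationships.
from itertools import combinations
from statistics import mean, pstdev

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)
    return cov / (pstdev(xs) * pstdev(ys))

# Invented monthly metrics for one institution
metrics = {
    "branch_visits": [120, 115, 108, 99, 90, 84],
    "mobile_logins": [300, 340, 390, 450, 520, 600],
    "support_calls": [40, 42, 39, 41, 40, 43],
}

# Surface only strong, possibly unanticipated relationships
for (a, xs), (b, ys) in combinations(metrics.items(), 2):
    r = pearson(xs, ys)
    if abs(r) > 0.8:
        print(f"{a} vs {b}: r = {r:+.2f}")
```

Here the scan would flag the strong inverse relationship between branch visits and mobile logins without anyone asking that specific question.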

Denisse Perez, Content Marketing Analyst for True Interaction, contributed to this post.

By Joe Sticca

Categories
Insights

Media Distribution and Syndication

With the advent of television syndication, no viewer is left unreached and no potential market left untapped by advertisers. In 2016, advertising revenue resulting directly from television syndication approached $1.86 billion. This figure, while still significant, is notably less than in previous years, as profits have been steadily declining since the most recent peak of $2 billion in annual revenue recorded in 2013. While television is finding fewer and fewer areas to expand into, even on a global scale, the digital space is experiencing sustained and relentless growth. The expansion of the digital universe includes the development of new platforms on which to consume a plethora of video content, some of which was previously available only on television. The shift in preference away from the traditional television set is liberating for consumers of video media, but poses a new set of difficulties for content creators.

Distribution

Following the model of television syndication transcending geographic limitations, any video media must be able to traverse the saturated digital space well in order to maximize exposure. However, the means of disseminating content across the web are much more varied and tedious than in the days of television broadcasting alone. The concepts of distribution and syndication with regard to digital media are often conflated. The two terms, however, convey different ideas. According to Kaltura, distribution is the simpler of the two processes, as it only entails the submission of content to a third-party host, who is then responsible for plugging it into their own player. The most recognizable of these third-party hosts, and an apt example, is YouTube.

Many content creators who opt to use third-party hosts to distribute their media are drawn to the simplicity and ease of the process. The need to choose and design a player is eliminated, and they are not responsible for selling advertising. Hosts with native apps also allow video media to be viewed on more devices, and since the creator does not pay to stream the content, there are fewer upfront distribution costs involved.

The problem with distribution often lies in control, or lack thereof. Essentially, this option requires that one submit their content as a single, unaccompanied unit. The host determines how the media will be displayed and presented, and any further alterations must be conducted through the platform. Even more critical is the absence of the full range of analytics available to the creator once the content has been handed over to the host. The shortage of relevant data can inhibit strategic decisions.

Syndication

Video syndication is a much more complex and tedious process. Simply put, “When you syndicate your content, you embed the entire thing as a package—the content itself and a video player of your choosing.” The level of difficulty may be higher, but the control one retains and the design offerings are much more comprehensive, and advertising decisions lie securely within the creator’s purview. A centralized hub for all of one’s media is also a key feature: once content is removed from one place, it disappears from all players, everywhere.

Video syndication, despite its wide array of advantages, is plagued with its own difficulties, and any content creator must be prepared by remaining vigilant of the following key issues:

1. Awareness of capability. The allure of syndicating to every platform that will accept one’s content is hard to resist, especially with the goal of reaching as many people as possible. However, it does not take long for things to get unwieldy when enforcing complex business rules with advertisers and syndication partners. With the aim of maximizing revenue in mind, AdMonsters details the issues associated with fulfilling one’s agreements with advertisers, such as reformatting ads for the different screens and devices they run on. Analytics and data discovery have predominantly been weighted toward the syndicator/distributor, but with the burgeoning organic growth of small- to mid-tier content providers, it is becoming clear that creators need analytics for creative purposes as well as for control and contract-enforcement purposes.

2. Technological congruence. Online video syndication continues to uncover issues as it develops. More recently, incompatible technologies used by content suppliers and distributing platforms have been cited as a glaring concern. TechTarget asserts that progress within this industry sector should move toward addressing the “need for a standardized data exchange mechanism, and the need for a standardized metadata vocabulary.”

3. Tracking and reporting. Syndication is hardly a hands-off process. On the contrary, AdMonsters proclaims that integrating syndication partners further necessitates an attentive eye and a data-driven approach to managing these partnerships. Content suppliers must be able to track how well their video media is performing, the revenue generated on each partner’s platform, and adherence to contract use terms. In some ways, content suppliers are trying to take back control of data tracking through new trends in blockchain technologies. Comprehensive, coherent reporting that tracks every aspect of one’s syndicated content is critical to framing an informed distribution strategy.
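The per-partner tracking described in these steps could be sketched as below: roll up performance and revenue by syndication partner and flag any delivery outside contracted territories. All partner names, figures, and contract terms are hypothetical.

```python
# Hypothetical per-partner tracking for syndicated video content.
from collections import defaultdict

# Invented delivery events reported back by syndication partners
events = [
    {"partner": "PartnerA", "views": 1200, "revenue": 36.00, "territory": "US"},
    {"partner": "PartnerA", "views": 800,  "revenue": 24.00, "territory": "US"},
    {"partner": "PartnerB", "views": 500,  "revenue": 10.00, "territory": "EU"},
    {"partner": "PartnerB", "views": 300,  "revenue": 6.00,  "territory": "APAC"},
]
# Contracted territories per partner (illustrative contract terms)
contracted = {"PartnerA": {"US"}, "PartnerB": {"EU"}}

totals = defaultdict(lambda: {"views": 0, "revenue": 0.0})
violations = []
for e in events:
    totals[e["partner"]]["views"] += e["views"]
    totals[e["partner"]]["revenue"] += e["revenue"]
    if e["territory"] not in contracted[e["partner"]]:
        violations.append((e["partner"], e["territory"]))

for partner, t in totals.items():
    print(partner, t["views"], round(t["revenue"], 2))
print("contract violations:", violations)
```

In practice these events would come from partner reporting APIs, but the roll-up-and-check shape is the same.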

Denisse Perez, Content Marketing Analyst for True Interaction, contributed to this post.

by Joe Sticca

Categories
Insights

3 Issues with Data Management Tools

The market is currently awash with BI tools that advertise lofty claims about their ability to leverage data to ensure ROI. It is evident, however, that these systems are not created equal, and implementing the wrong one could adversely affect an organization.

Cost

While the consistent multifold growth of the digital universe is ushering in lower costs for data storage (a decline reported to be as much as 15 to 20 percent in the last few years alone), it is also the catalyst for the rising cost of data management. The cause for concern does not lie in the storage technologies themselves, but in the increasing complexity of managing data. The demand for people with adequate data management skills is not being sufficiently met, forcing organizations to train personnel from within. The efforts required to equip organizations with the skills and knowledge to properly wield these new data management tools demand a considerable portion of a firm’s time and money.

Integration

The increased capacity of a new data management system can be hindered by the existing environment if integration is not handled with the proper care and supervision. When a different system is introduced into a company’s current technological environment, alongside external data pools (e.g. digital, social, mobile, devices), the issue of synergy between the old and the new remains. CIO identifies this as a common oversight and advises organizations to remain cognizant of how data will be integrated from different sources and distributed across different platforms, and to closely observe how any new data management system operates with existing applications and other BI reporting tools to maximize the insight extracted from the data.

Evan Levy, VP of Data Management Programs at SAS, shares his thoughts on the ideal components of an efficient data management strategy as well as the critical role of integration within this process, asserting that:

“If you look at the single biggest obstacle in data integration, it’s dealing with all of the complexity of merging data from different systems… The only reasonable solution is the use of advanced algorithms that are specially designed to support the processing and matching of specific subject area details. That’s the secret sauce of MDM (Master Data Management).”

Reporting Focus

The massive, seemingly unwieldy volume is one major concern amid this rapid expansion of data; the other is that most of it is largely unstructured. Many data management tools offer to relieve companies of this issue by scrubbing the data clean and meticulously categorizing it. The tedious and expensive process of normalizing, structuring, and categorizing data does carry some informational benefit and can make reporting on the mass of data much more manageable. In the end, however, a lengthy, well-organized report does not guarantee usable business insight. According to research conducted by Gartner, 64% of business and technology decision-makers have difficulty getting answers simply from their dashboard metrics. Many data management systems operate mostly as visual reporting tools, lacking the knowledge discovery capabilities imperative to producing actionable intelligence for the organizations they serve.

The expenses that many of these data management processes pose for companies, and the difficulties of integrating them with existing applications, may prove fruitless if they cannot provide real business solutions. Hence, data collection should not be done indiscriminately, nor should its management be conducted with little forethought. Before deciding on a Business Intelligence system, it is necessary to begin with a strategic business question to frame the data management process and ensure the successful acquisition and application of big data, both structured and unstructured.

Joe Sticca, Chief Operating Officer of True Interaction, contributed to this post.

By Justin Barbaro

Categories
Insights

Ensure Data Discovery ROI with Data Management

The explosion of data is an unavoidable facet of today’s business landscape. Domo recently released the fourth annual installment of its Data Never Sleeps research for 2016, illustrating the amount of data generated in one minute across a variety of platforms and channels. The astounding rate at which data has been growing shows no sign of slowing down, with a digital universe of nearly 44 trillion gigabytes anticipated by the year 2020. With data being produced at an unprecedented rate, companies are scrambling to put data management practices in place to avoid being overwhelmed, and eventually bogged down, by the deluge of information that should be informing their decisions. There are a plethora of challenges in calibrating and enacting an effective data management strategy, and according to Experian’s 2016 Global Data Management Benchmark Report, a significant number of these issues are internal.

Inaccurate Data

Most businesses strive for more data-driven insights, a feat rendered more difficult by the collection and maintenance of inaccurate data. Experian reports that 23% of customer data is believed to be inaccurate. While over half of the companies surveyed attribute these errors to human error, a lack of internal manual processes, inadequate data strategies, and inadequacies in relevant technologies are also known culprits in the perpetuation of inaccurate data. And while erroneous data entry is still largely attributed to human oversight, it is the lack of technological knowledge and ability that bars many companies from leveraging their data, which brings us to the next point.

Data Quality Challenges

Inescapable and highly important, the sheer volume of information being generated every second requires organizations to improve their data culture. Research shows that businesses face challenges in acquiring the knowledge, skills, and human resources to manage data properly. This holds for organizations of all sizes and resources, not just large companies, as a baffling 94% of surveyed businesses admit to having experienced internal challenges when trying to improve data quality.

Reactive Approach

Experian’s data sophistication curve identifies four levels of data management sophistication based on the people, processes, and technology associated with the data: unaware, reactive, proactive, and optimized. While the ultimate goal is ascending to the optimized level of performance, 24% of the polled businesses categorize their data management strategies as proactive, while the largest share (42%) admit to merely reaching the reactive level. The reactive approach is inefficient in many ways, most prominently because waiting until specific data issues crop up before addressing and fixing them, rather than detecting and resolving such problems in a timely manner, creates data management difficulties both internal and external.

The most damaging consequence of failing to address these pressing issues as they are detected is the neglect of invaluable business insight concealed in the mass of available data. Data management systems that have not optimized their operations will not be able to process data into relevant information in a timely manner. The lack of machine learning mechanisms within these sub-optimal systems hinders businesses in their knowledge discovery process, barring organizations from making data-driven decisions in real time.

Denisse Perez, Content Marketing Analyst for True Interaction, contributed to this post.

by Joe Sticca

Categories
Insights

Engaging and Keeping Your Most Valuable Mobile Customers

Retention

According to WhaTech, some 80-90% of apps downloaded from app stores are opened only once before being uninstalled. This has been the norm for years. That’s quite the endorsement for a proper preload with the business requirement that the user cannot remove the app. Facebook has found that only about 6% of the apps it works with are still being used 30 days post-install. KISSmetrics found that it can be up to 7x more expensive to acquire a new user than to retain a current one. Acquisition campaigns should be structured with retention in mind, which requires a keen understanding of the lifetime value (LTV) of one’s customer segments.

Who are your avid consumers? Which customers consistently purchase your higher-tier content? Who consistently subscribes to your services? You can use Synaptik to segment these buckets. Over the course of the next three years, programmatic will likely become the only way to engage them. Firms like Element Wave focus on user engagement and retention via real-time, precision targeting at key moments during games or app navigation. They utilize native messaging and well-timed push messages to drive users back to the app with incentives and contextual information, such as how many people in a geo-fenced region are placing a bet or bidding on an item. Whether for gaming or non-gaming apps on Android or iOS, the latest study by Localytics shows Facebook is still the most valuable platform for targeting.

“Retention has transcended to becoming a highly objective metric to measure how users find your product or service,” says WhaTech. “Market research suggests that even a 5 percent increase in customer retention can lead to increasing profits by 25 percent to 125 percent.”

Customer Acquisition Cost

How much is your firm spending on acquisition for mobile ad campaigns? You need to look at cost per install and cost per registration, as well as your conversion rate from installing the app to registering as a user. There’s a big drop-off point here in mobile UX, which is why single sign-on is huge. As a mobile product manager, I found that the best way to convert installs into purchases was to deliver the experience of the application with deferred sign-on, without requiring initial registration. I often find myself deleting apps if the path to use is too cumbersome and requires too many fields to be filled out.

Obviously, the total cost per acquisition is a key figure, but we just want to make sure it’s less than the lifetime value of that acquisition. You decide what your margin needs to be. What does your model look like for 2017? Let us know if we can be of assistance with the development of your mobile app or website. Do you need help getting your app to market, or finding a way to get it preloaded onto the newest mobile handsets and tablets? Reach out. True Interaction has a team of mobile experts. At True Interaction, we stay ahead of the curve and help you do the same.
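The CAC-versus-LTV comparison above amounts to simple funnel arithmetic. A minimal sketch, with all figures invented for illustration:

```python
# Back-of-the-envelope funnel math: spend -> installs -> registrations.
def cost_per_registration(spend, installs, install_to_reg_rate):
    """Blended cost to acquire one registered user."""
    registrations = installs * install_to_reg_rate
    return spend / registrations

spend = 10_000.0        # campaign spend ($), illustrative
installs = 5_000        # installs driven by the campaign
install_to_reg = 0.40   # 40% of installs complete registration

cac = cost_per_registration(spend, installs, install_to_reg)
ltv = 12.50             # modeled lifetime value of a registered user

print(f"CAC = ${cac:.2f}, LTV = ${ltv:.2f}")
print("campaign viable:", ltv > cac)
```

The sensitivity is worth noting: raising the install-to-registration rate (for example, via deferred sign-on) lowers CAC directly, which is exactly why that drop-off point matters so much.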

by David Sheihan Hunter Lindez

Categories
Insights

Digital Transformation Capability and the Modern Business Landscape

Yesterday morning, The Wall Street Journal reported that Goldman Sachs Group Inc. dropped out of the R3CEV LLC blockchain group. R3 has been notable in corralling 70 different banks and financial firms to join its group since 2014, including Bank of America, J.P. Morgan, and State Street. A spokesperson commented on the company’s departure:

Developing technology like this requires dedication and significant resources, and our diverse pool of members all have different capacities and capabilities which naturally change over time.

For the record, Goldman Sachs will continue to invest in blockchain technology, including the startups Circle and Digital Asset Holdings, but there is only speculation as to exactly why Goldman Sachs’ membership with R3 expired. It may have been related to disagreement over the equity distribution models between R3 and its members, but just a month earlier, when R3 announced its blockchain proof-of-concept prototype exercise, R3 CEO David Rutter commented:

Quality of data has become a crucial issue for financial institutions in today’s markets. Unfortunately, their middle and back offices rely on legacy systems and processes – often manual – to manage and repair unclear, inaccurate reference data.

The truth is that there is still a wide range of digital capability across, within, and beyond businesses, big or small.

Getting the whole gang onboard

Perhaps Goldman Sachs’ departure is due to exactly this: some aspects of its business units are behind the power curve in their digital transformation and data management efforts.

Digital transformation can be a painstakingly complicated process, partially because, according to Computer Weekly, some parts of the transformation process aren’t even executed by the organization itself, yet still require all the vigilance its CIO and IT units can muster, since they remain ultimately the organization’s responsibility:

Companies of all kinds are increasingly using technology partners, channel partners, contract manufacturers, warehousing and logistics partners, service partners and other outside services to handle all or part of a business process. Most enterprises come to view these partners as the extended enterprise, and look for ways to have tight integration and collaboration with them.

To achieve effective, successful transformation, digital business leaders must get their whole business ecosystem onboard with a clear, discernible, comprehensive strategic digital transformation plan that touches upon all of the extended enterprise. To assess and act on digital transformation opportunities, McKinsey suggests four steps:

1. Estimate the value at stake. Companies need to get a clear handle on the digital-sales and cost-reduction opportunities available to them. Digital—and digitally influenced—sales potential should be assessed at the product level and checked against observed internal trends, as well as competitor performance. On the cost side, administrative and operational processes should be assessed for automation potential, and distribution should be rightsized to reflect digital-sales growth. The aggregate impact should be computed and turned into a granular set of digital targets to monitor progress and drive value capture.

2. Prioritize. Most organizations don’t have the ability, resources, or risk tolerance to execute on more than two or three big opportunities at any one time. Be selective. Figure out what areas are likely to deliver the greatest return on investment and the best customer outcomes and start there. While digital requires some experimentation, too many ad hoc demos and showcases lead to scattershot investments that fail to deliver sustained value. One retailer, for instance, ended up with 25 subscale digital offerings by not culling in the right places.

3. Take an end-to-end view. One financial-services firm built a world-class digital channel but failed to update the paper-based processes that supported it—processes that were prone to error. That false veneer of speed and efficiency eroded trust and turned off customers. The moral? Although it may seem counterintuitive, overinvestment in a slick front end that is not matched with the corresponding high-quality fulfillment that customers now expect may actually lead to increased customer frustration.

4. Align the business portfolio accordingly. In the long run, some lines of business will simply be destroyed by digital. Hanging on and tweaking them is futile. Companies need to act purposefully and divest where it makes sense, identifying what holdings are likely to be cannibalized or likely to underperform in the new environment and sloughing them off. Conversely, some areas will clearly need new capabilities and assets, which companies often do not have the luxury to build up organically over time. One retailer used targeted acquisitions to rapidly build out its e-commerce capabilities, allowing it to focus on defining strategy and aspirations rather than tinkering with the “plumbing.” Source.

Creating new monetizable value

A recent report by Gartner revealed that organizations often miss out on a bevy of monetizable value by overemphasizing traditional silos and markets (marketing, social media, mobile applications, etc.). A too-narrow focus means organizations capture only a small share of the full value that digital transformation can provide. Saul Judah, research director at Gartner, says,

All too often IT leaders focus value creation more narrowly, with the result that most digital initiatives are aimed at operational improvements, rather than value transformation. While this tactical approach to digital value can result in very real process and financial improvements, the greatest potential for digital value lies in more strategic initiatives, such as creating new markets, empowering employees, changing the basis of competition and crossing industry boundaries.

IT leaders need to work with the business side of the house to identify and exploit these high-value initiatives.

Algorithms and analytics offer accelerators of value and are themselves of exchangeable and monetizable value. An analytics process may use algorithms in its creation, which could also be monetizable through an algorithmic marketplace, making it available to enterprises of all types and sizes to use.

For example, True Interaction’s data agnostic machine learning analytics platform, SYNAPTIK, is rolling out a data marketplace where organizations can syndicate and distribute new data revenue opportunities and actions to their clients, as well as other platforms.

Digital transformation and the modern enterprise landscape

The Blockchain endgame?

Blockchain technology offers several benefits to an organization. It uses new methods of encryption that enable anonymous sharing of information in a data-rich environment. Blockchains also support smart contracts: computer protocols that facilitate, verify, or enforce the negotiation or performance of contract terms. And with blockchain, the dataset remains updated and intact at all times, without the need for a central governing authority.

Decentralized systems using blockchain technology can manage the data relationships and sequence of events where all parties share the same data source. Furthermore, with the impending intersection of the Internet of Things and blockchain technology, digitally tenacious organizations will soon be able to connect, conceivably, anything with anything, and get them communicating intelligently and securely. Enterprises that embrace this phenomenon will be able to provide a better user experience and value-added services, as well as gain competitive advantage and differentiation.
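The tamper-evidence property underpinning these benefits can be illustrated with a minimal hash-chain sketch. This is not a real blockchain (no consensus, no network, no smart contracts); it only shows the linking idea that keeps a shared dataset intact without a central authority.

```python
# Minimal hash-chain sketch: each block stores the hash of its
# predecessor, so altering any earlier record breaks every later link.
import hashlib
import json

def block_hash(block):
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def make_chain(records):
    chain, prev = [], "0" * 64
    for r in records:
        block = {"data": r, "prev": prev}
        prev = block_hash(block)
        chain.append(block)
    return chain

def is_valid(chain):
    prev = "0" * 64
    for block in chain:
        if block["prev"] != prev:
            return False
        prev = block_hash(block)
    return True

chain = make_chain(["pay A 10", "pay B 5", "pay C 2"])
print(is_valid(chain))          # intact chain -> True
chain[0]["data"] = "pay A 999"  # tamper with an early record
print(is_valid(chain))          # later links now break -> False
```

Production blockchains add consensus and cryptographic signatures on top of this, but the verification step all parties run is conceptually the same walk down the chain.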

The rumble between Goldman Sachs and R3 shows that we are still a ways off from settling the exact standards of blockchain in business. With certainty, some markets need to topple age-old paradigms of strategic thinking that are no longer relevant in a digital world. But the promise is quite exciting.

by Michael Davison

Categories
Insights

Wrangling Data for Compliance, Risk, and Regulatory Requirements

(N.B. This article addresses the financial services industry; however, the insights and tips therein are applicable to nearly any industry today. ~EIC)

The financial services industry has always been characterized by its long list of compliance, risk, and regulatory requirements. Since the 2008 financial crisis, the industry is more regulated than ever, and as organizations undergo digital transformation and financial services customers continue to do their banking online, the myriad compliance, risk, and regulatory requirements for financial institutions will only increase from here. On a related note, organizations are continuing to invest in infrastructure to meet these requirements. IDC Financial Insights forecasts that the worldwide risk information technologies and services market will grow from $79 billion in 2015 to $96.3 billion in 2018.

All of this means reams of data. Financial firms by nature produce enormous amounts of data, and due to compliance requirements, must be able to store and maintain more data than ever before. McKinsey Global Institute reported in 2011 that the financial services industry has more digitally stored data than any other industry.

To succeed in today’s financial industry, organizations need to take a cumulative, 3-part approach to their data:

1. Become masters at data management practices.

This appears obvious, but the vast number of compliance, risk, and regulatory requirements necessitates that organizations become adept at data management. Capgemini identified 6 aspects of data management best practices:

Data Quality. Data should be kept optimal through periodic review, and all standard dimensions of data quality (completeness, conformity, consistency, accuracy, duplication, and integrity) must be demonstrated.

Data Structure. Financial services firms must decide whether their data structure should be layered or warehoused. Most prefer to warehouse data.

Data Governance. It is of utmost importance that financial firms implement a data governance system, including a data governance officer who owns the data and monitors data sources and usage.

Data Lineage. To manage and secure data appropriately as it moves through the corporate network, it needs to be tracked to determine where it is and how it flows.

Data Integrity. Data must be maintained to assure accuracy and consistency over the entire lifecycle, and rules and procedures should be imposed within a database at the design stage.

Analytical Modeling. An analytical model is required to parcel out and derive relevant information for compliance.
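A few of the dimensions above (completeness, conformity, duplication) lend themselves to automated checks. The sketch below is illustrative only; the field names and rules are invented, not a Capgemini standard.

```python
# Hypothetical data quality checks over a batch of customer records.
import re

records = [
    {"id": "A1", "email": "jo@example.com", "balance": "1200.50"},
    {"id": "A2", "email": "",               "balance": "300.00"},
    {"id": "A1", "email": "jo@example.com", "balance": "1200.50"},  # duplicate id
]

def quality_report(rows):
    """Score completeness, conformity, and duplication for a batch."""
    email_ok = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")
    complete = sum(1 for r in rows if all(r.values()))          # no empty fields
    conform = sum(1 for r in rows if email_ok.match(r["email"]))  # valid email shape
    dupes = len(rows) - len({r["id"] for r in rows})            # repeated ids
    return {
        "completeness": complete / len(rows),
        "conformity": conform / len(rows),
        "duplicates": dupes,
    }

print(quality_report(records))
```

Running such checks on a schedule is one concrete way to implement the "periodic review" the Data Quality practice calls for.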

2. Leverage risk, regulatory, and compliance data for business purposes.

There is a bright side to data overload; many organizations aren’t yet taking full advantage of the data they generate and collect. According to PWC, leading financial institutions are now beginning to explore the strategic possibilities of the risk, regulatory, and compliance data they own, as well as how to use insights from this data and analyses of it in order to reduce costs, improve operational efficiency, and drive revenue.

It’s understandable that in the business processes of many financial institutions today, the risk, regulatory, and compliance side of the organization does not actively collaborate with the sales and marketing teams. The tendency toward siloed structure and behavior makes it difficult to reuse data across the organization. Certainly an organization can’t completely change overnight, but consider the tips below to help establish incremental change within your organization:

Cost Reduction: Eliminate the need for business units to collect data that the risk, regulatory, and compliance functions have already gathered, and reduce duplication of data between risk, regulatory, compliance, and customer intelligence systems. Avoid wasted marketing expenses by carefully targeting campaigns based upon an improved understanding of customer needs and preferences.

Increased Operational Efficiency: Centralize management of customer data across the organization. Establish a single source of truth to improve data accuracy. Eliminate duplicate activities in the middle and back office, and free resources to work on other revenue generating and value-add activities.

Drive Revenue: Customize products based upon enhanced knowledge of each customer’s risk profile and risk appetite. Identify new customer segments and potential new products through better understanding of customer patterns, preferences, and behaviors. Enable a more complete view of the customer to pursue cross-sell and up-sell opportunities.

3. Implement a thorough analytics solution that provides actionable insight from your data.

Today, it’s possible for financial organizations to implement an integrated Machine Learning component that runs in the background: it can ingest data of all types from any number of people, places, and platforms; intelligently normalize and restructure that data so it is useful; run a dynamic series of actions based upon data type and whatever situational context your business process is in; and create dynamic BI data visualizations out-of-the-box.

Machine Learning Platforms like SYNAPTIK enable organizations to create wide and deep reporting, analytics, and machine learning agents without being tied to expensive, proprietary frameworks and templates such as Tableau. SYNAPTIK allows for blending of internal and external data in order to produce valuable new insights. There’s no data modeling required to drop in 3rd party data sources, so it is even possible to create reporting and insight agents across data pools.

By Michael Davison

Categories
Insights

Robot Advocates: When Consumer Products Advocate on the Customer’s Behalf

The Situation

Recently, True Interaction lead backend developer Darrik Mazey experienced some troubling hardware issues – this time, outside of his hectic True Interaction project schedule.

His family’s two-year-old LG refrigerator went down, requiring multiple visits by the repairman to diagnose and fix, pulling him away from professional commitments, and generally making life annoying as he tried in vain to keep his family’s perishables from spoiling.

Leveraging Social Media, the ‘Old Fashioned’ Way

During this process, he learned the benefits of leveraging social media to improve his customer service experience with LG.

When Darrik began tweeting about the fridge, LG offered a $50 food reimbursement, and extended his warranty. But this was just the beginning of further product woes, which eventually became the catalyst for an awesome solution.

Problems with the refrigerator persisted after the first “repair” job, and Darrik eventually had to rent another fridge while further diagnosis continued. In all, he estimated the cost due to missed work and equipment rental to be over $1500, no small amount for a hard-working family man.

Implementing a Real-time Solution with Xymon

Being the veteran coder that he is, Darrik resolved to ensure that, should the hardware fail again, he would be informed about it immediately via Xymon. In a nutshell, Xymon monitors hosts, network services, and anything else you might configure it to do via extensions. It can periodically generate requests to network services — http, ftp, smtp and so on — and record whether the service is responding as expected.

In a recent personal blog post, Darrik recounts his refrigerator woes, and how he configured a controlbyweb temperature module, connected it to the fridge, networked it via wireless AP and monitored the fridge and freezer temperatures via Xymon 4.3.10.

Darrik monitors his Xymon network via its own Twitter account, which conveniently tweets him status updates and alerts him when something is awry. Now Darrik receives a tweet as soon as the fridge goes above 40 degrees.
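The alerting logic behind such a setup is simple enough to sketch. The Python fragment below is a hypothetical illustration, not Darrik’s actual Xymon extension; the threshold, message format, and function names are all invented for the example:

```python
FRIDGE_THRESHOLD_F = 40.0  # a reading above this puts perishables at risk

def should_alert(temp_f, threshold=FRIDGE_THRESHOLD_F):
    """Return True when a temperature reading crosses the safe threshold."""
    return temp_f > threshold

def format_alert(temp_f):
    """Build the status message a monitor might tweet when the fridge fails."""
    return "@LGUS fridge status: %.1fF - above safe threshold #fail" % temp_f

# A monitoring loop would poll the temperature module on a schedule and
# tweet only when the threshold is crossed:
reading = 41.5  # a sample reading from the sensor, in Fahrenheit
if should_alert(reading):
    print(format_alert(reading))
```

In a real deployment, the reading would come from the networked temperature module over HTTP and the message would be posted via the Twitter API, with Xymon’s extension mechanism handling the scheduling.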

Maximizing the Value of Social Media in the SPIME Era

Learning from his recent foray into using social media to improve his customer service experience, he’s now configured the system so that — should the LG fridge malfunction again — it will tweet @LGUS, LG’s official Twitter account, with a status message, the current temperature inside the fridge, and a #fail hashtag.

We are living in an age where, with a little work and ingenuity, products can be configured to actually hold their parent company accountable — in public — regarding their quality and the customer service surrounding them. We’re nearly touching upon what futurist Bruce Sterling (who recently shared his take on Google Glass) describes as the “spime era”. Sterling uses the term SPIME for connected, information-rich objects that operate and communicate in an internet of things.

Since the onset of Friendster in 2002, social media has evolved from a form of informal communication, to the hottest marketing platform since TV advertisements, to a very viable channel for consumers to elicit and receive customer service. Certainly entrepreneurs and small business owners need to understand that their products and services live and die by consumer sentiment (robotic or otherwise) on social media. While this sounds like a precarious situation, the flipside is that with not too much effort, entrepreneurs and SMBs can preemptively mine social media regarding their products and services in order to unearth incredibly valuable intelligence. This includes discovering not only general consumer sentiment, but also other perspectives on their products and services they might never have considered, such as creative suggestions on improvement and unique new use cases, as well as using social media as a dragnet for discovering bugs, service gaps, or product flaws.

With that in mind, entrepreneurs and SMB owners should establish their social media intelligence network as soon and as thoroughly as possible in order to monitor relevant SM channels and get into the trenches to interface directly with their consumers. Utilize customers for the intelligence they can provide, and reward them – even if it’s just recognition or a “thank you” – when they offer insight on the product or service, even if that insight isn’t directed at the organization. When a business steps in, gets involved, and interacts, and consumers discover that it is listening and cares about their opinions, it has converted a customer for the long term.

The Future

So what’s next? What happens when consumers can make decisions about the products they choose to purchase, based upon products themselves sharing their real-time data on a social media channel? What happens when brands are judged by their ability to manage the self-communicating products they’ve birthed? And considering all this, where will manufacturer responsibility for their products end or extend in the near future? All of this is fascinating and mind-boggling stuff to ponder for a moment. I’d love to hear your thoughts.

by Michael Davison

Categories
Insights

How Blockchain will Transform Media and Entertainment

I recently blogged about blockchain where I described the technology and why it could be a revolutionary development, and also explored its various applications across a number of industries. Today, let’s take a closer look at blockchain in entertainment and media: its various applications, how it is transforming the industry, and its implications for the near future.

Quick Recap

A blockchain is a type of data structure that can be purposed to create a digital ledger of transactions that can be shared among a distributed network of computers. By using cryptography, each account on the network may access and manipulate the ledger securely, without the need for any central authority or middleman – this is the supreme concept to take away. Blockchain is:

Reliable and available. Because a wide circle of participants share a blockchain, it has no single point of failure and is designed to be resilient in the face of outages or attacks. If any node in a network of participants fails, the others will continue to operate, maintaining the information’s availability and reliability.

Transparent. Transactions on the blockchain are visible to its participants, increasing auditability and trust.

Immutable. It is nearly impossible to make changes to a blockchain without detection, increasing confidence in the information it carries and reducing the opportunities for fraud.

Irrevocable. It is possible to make transactions irrevocable, which can increase the accuracy of records and simplify back-office processes.

Digital. Almost any document or asset can be expressed in code and encapsulated or referenced by a ledger entry, meaning that blockchain technology has very broad applications, most as yet unimagined, much less implemented.

Given these characteristics, blockchain technology will likely be a catalyst for transformative innovation in nearly every industry.
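The immutability property above follows from how blocks are linked: each block carries a hash of its own contents, which include the previous block’s hash. Here is a toy Python sketch of that hash-chaining, with none of the networking or consensus machinery of a real blockchain; the structure and names are purely illustrative:

```python
import hashlib
import json

def block_hash(contents):
    """Hash a block's contents, which include the previous block's hash."""
    payload = json.dumps(contents, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def append_block(chain, data):
    """Add a block whose hash covers both its data and its predecessor."""
    prev = chain[-1]["hash"] if chain else "0" * 64
    block = {"data": data, "prev_hash": prev}
    block["hash"] = block_hash({"data": data, "prev_hash": prev})
    chain.append(block)

def is_valid(chain):
    """Recompute every hash; any edit to an earlier block breaks the links."""
    for i, block in enumerate(chain):
        expected = block_hash({"data": block["data"],
                               "prev_hash": block["prev_hash"]})
        if block["hash"] != expected:
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

chain = []
append_block(chain, "alice pays bob 5")
append_block(chain, "bob pays carol 2")
assert is_valid(chain)
chain[0]["data"] = "alice pays bob 500"  # tampering...
assert not is_valid(chain)               # ...is immediately detectable
```

Editing an old block changes its hash, which no longer matches the `prev_hash` stored in its successor; in a real network, every participant holding a copy of the ledger can detect the mismatch, which is what makes changes without detection nearly impossible.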

Blockchain as purposed in Entertainment and Media

Payments

One of the most obvious applications of blockchain in the media is its ability to support micropayments that can be processed without the need for an intermediary payment network or its fees. Generally speaking, without blockchain, intermediary payment fees are too cost-prohibitive to enable micropayments of values less than $1. Chris Dixon explained this wonderfully in an article by venture capitalist Marc Andreessen:

“Let’s say you sell electronics online. Profit margins in those businesses are usually under 5 percent, which means conventional 2.5 percent payment fees consume half the margin. That’s money that could be reinvested in the business, passed back to consumers or taxed by the government. Of all of those choices, handing 2.5 percent to banks to move bits around the Internet is the worst possible choice. Another challenge merchants have with payments is accepting international payments. If you are wondering why your favorite product or service isn’t available in your country, the answer is often payments.”

Ostensibly very little would change from a consumer standpoint; consumers are used to purchasing songs for $0.99. But behind the transactional curtain, everything would be transformed.

Without a middleman, when digital objects can be cryptographically associated with their creators, there is no need for distribution channels. Think about that for a moment. This technology can put all kinds of media directly in the control and management of their creators, obviating the need for iTunes, Netflix, or Amazon. Or record labels. Or publishing houses. Blockchain has the potential to erode the financial paradigms that conglomerate media companies have been using for the past century. Says Bruce Pon:

Imagine a future where creators upload their content to Facebook. There’s a “Buy” button on the bottom right corner. A consumer clicks it, and in a split second, the content is licensed to them, payment flows in the opposite direction to the creator, and the transaction is recorded on the blockchain.

Today, we are already seeing startups that are exploring new payment models through blockchain technology that are focused upon bringing more value to content creators. Ujo, in their own words, is a “home for artists that allows them to own and control their creative content and be paid directly for sharing their musical talents with the world.” The Ujo platform uses blockchain technology “to create a transparent and decentralized database of rights and rights owners, and automates royalty payments using smart contracts and cryptocurrency,” says Phil Barry, founding partner at Edmund Hart, which oversees Ujo. “We hope that it will be the foundation upon which a new more transparent, more efficient and more profitable music ecosystem can be built.”

Scarcity

Generally, digital objects can lose value because they are easily copied. We see this especially in the area of pirated music, movies, and TV. But because blockchain makes it possible for creators to register the origin of a work, set sharing permissions, and structure the means of exchange they’re willing to accept, it is possible to create conditions for “digital scarcity”.

Consider a situation where an artist creates a piece of music in .mp3 format, and programs the ability for only 1000 people to listen to it, with the price of the .mp3 increasing with each new listener.
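A pricing rule like that could be sketched as follows. This is purely illustrative Python standing in for what would, in practice, be encoded in a smart contract; the base price and increment are invented, while the 1000-listener cap comes from the scenario above:

```python
MAX_LISTENERS = 1000  # the artist's hard cap: digital scarcity
BASE_PRICE = 0.99     # dollars paid by the first listener (assumed)
INCREMENT = 0.01      # price rise per listener already admitted (assumed)

def price_for_listener(n):
    """Price in dollars paid by the n-th listener (1-indexed).

    Raises ValueError once all scarce listens are taken.
    """
    if not 1 <= n <= MAX_LISTENERS:
        raise ValueError("all %d listens are taken" % MAX_LISTENERS)
    return BASE_PRICE + (n - 1) * INCREMENT

print(price_for_listener(1))     # the first listener pays the base price
print(price_for_listener(1000))  # the last, scarcest listen costs the most
```

Because the cap and the price curve are registered with the work itself, no distributor needs to enforce them; the ledger does.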

Relationship to consumer

Because blockchain eliminates the need for a middleman, the technology creates new opportunities for large corporations to get closer to their customers and consumers. Because the playing field allows individual creators to connect with their consumers directly, the onus is on the bigger media companies to operate more nimbly, and to offer more varied and interactive pricing models for their content based upon every individual consumer’s actions and purchases. And because provenance, payment, and distribution become simpler and less expensive to manage with blockchain, bigger media companies can concentrate more on creating quality content itself.

Reflection

While we haven’t seen it just yet, blockchain technology promises to transform a myriad of industries – especially media and entertainment. To the media consumer, it likely means more access to more content, from more creators, on a much more personal, secure, and granular level. For content creators, it means a much simpler attribution, payment, and distribution system, and the ability to be creative with payment models and the concept of “digital scarcity”. It will be exciting to see what happens.

If you are an SMB owner and want to learn more about blockchain, check out my first post on the subject. And by all means, get in touch with an expert. The time is now to begin exploring implementation of blockchain technology in your business. Consider these statistics:

– A billion dollars in venture capital has flowed to more than 120 blockchain-related startups, with half that amount invested in the last 12 months.
– Thirty of the world’s largest banks have joined a consortium to design and build blockchain solutions.
– Nasdaq is piloting a blockchain-powered private market exchange.
– Microsoft has launched cloud-based blockchain-as-a-service.

We’ve got the Experience you Need
True Interaction has logged hundreds of thousands of hours of research, design, development, and deployment of consumer products and enterprise solutions. Our services directly impact a variety of industries and departments through our deep experience in producing critical business platforms.

We Can Integrate Anything with… Anything

Per the endlessly variable demands of each of our global clients, TI has seen it all, and has had to do it all. From legacy systems to open source, we can determine the most optimal means to achieve operational perfection, devising and implementing the right tech stack to fit your business. We routinely pull together disparate data sources, fuse together disconnected silos, and do exactly what it takes for you to operate with tight tolerances, making your business engine hum. Have 100+ platforms? No problem. Give us a call.

By Michael Davison

Categories
Insights

The Changing Terrain of Media in the Digital Space

The rapid digitization of the media industry does not merely address the immediate needs posed by the market, but also anticipates the constantly changing consumer behavior and rising expectations of an increasingly digital customer. The World Economic Forum points to a growing middle class, urbanization, the advent of tech-savvy millennials demanding instantaneous access to content on a variety of platforms, and an aging world population that is invariably accompanied by the need for services designed for an older audience as the most pronounced demographic factors currently contributing to the reshaping of the media landscape. The expanding list of accommodations that customers are coming to expect from the media industry more or less falls within the realms of accessibility and personalization of content.

The Path to Digital Transformation

Average weekday newspaper circulation has been on a steady decline, falling another 7% in 2015 according to the most recent Pew Research Center report. This inevitable dwindling of interest in print publications could be ascribed to the rising demand for media companies to adopt a multi-channel strategy that enables the audience to access content across different platforms. Companies remedy their absence of a formidable digital presence in a variety of ways. One of the most common resolutions involves redesigning the business model by bundling print subscriptions with mobile device access, a measure enacted to address the 78% of consumers who view news content on mobile browsers. A more radical approach could be opting for a complete digital transformation, a decision reached by The Independent earlier this year when it became the “first national newspaper title to move to a digital-only future.” The appeal of having information become readily available on any screen of the customer’s choosing is magnified by the expectation of uniformity and equally accessible and engaging user interfaces across all devices. Of course, convenience to the customer does not only rely on their ability to access content on the platform of their choice, but also at any point they desire, hence the focus on establishing quick response times and flexibility of content availability.

Another expectation that consumers have come to harbor, aside from unhindered access to content, is the minimization, if not the complete elimination, of superfluous information. According to the 2016 Digital News Report by the Reuters Institute, news organizations such as the BBC and the New York Times are striving to provide more personalized news on their websites and applications. In some cases, people are offered information and clips on topics in which they have indicated an interest. Additionally, companies are developing “auto-generated recommendations based in part on the content they have used in the past.” Transcending written material, streaming platforms like Pandora and Netflix utilize Big Data to analyze and discern the characteristics and qualities of an individual’s preferences, feeding information into a database that then uses predictive analytics to determine content the individual would be predisposed to enjoy. In previous blog posts, we have discussed the value of understanding Big Data, emphasizing how execution based on the insight gleaned from Big Data could be as crucial to a company’s profitability as the insight itself. As evidenced by this growing practice of collecting consumer data in order to cultivate personalized content, it is obvious that the media industry has not been remiss in its observation of the discernible success that data-driven companies boast relative to competitors that are not as reliant on data. Finally, perhaps as satisfying as being able to browse through personalized, recommended content based on one’s past likes and preferences is the exclusion of repetitive content, as informed by one’s viewing history.

Media companies embrace their ascent into digital space in a plethora of ways. Some elect for a complete digital transformation, conducting a substantial part if not all of their business within browsers and applications rather than in print. There are also those that focus on enhancing the customer experience by maintaining contact with consumers through all touch points and following them from device to device, all the while gathering data to be used in optimizing the content provided. Another means through which media companies are realizing their full digital potential is the digitizing of their processes and operations. These businesses are initiating a shift towards digital products, a decision that is both cost-effective (cutting costs up to 90% on information-intensive processes) and can bolster the efficacy of one’s data mining efforts. Warner Bros was one of the first in the industry to transform its ways of storing and sharing content into a single, fully integrated digital operation, beginning with the Media Asset Retrieval System (MARS). This innovative digital asset management system ushered in a transformation that effectively lowered Warner Bros’ distribution and management costs by 85%.

A Glimpse into the Future

So what’s next in this journey to digital conversion? According to the International News Media Association (INMA), all roads lead to the Internet of Things (IoT). Business Insider Intelligence asserts that by 2018, more than 18 billion devices will be connected to the Web. The progression into this new era of tech, where information can be harvested from the physical world itself, will not go unobserved by the media industry. Media companies are tasked with having to evolve beyond the screen.

Mitch Joel, President of Mirium Agency, writes:

“Transient media moments does not equal a strong and profound place to deliver an advertising message… the past century may have been about maximizing space and repetition to drive brand awareness, but the next half century could well be about advertising taking on a smaller position in the expanding marketing sphere as brands create loyalty not through impressions but by creating tools, applications, physical devices, true utility, and more robust loyalty extensions that makes them more valuable in a consumer’s life.”

Big Data anchors the efforts into the Digital Age, and the IoT will provide new, vital networks of information to fortify this crusade.

Contact our team to learn more about how True Interaction can develop game-changing platforms that cut waste and redundancy as well as boost margins for your media company.

By Justin Barbaro

Categories
Insights

The Decade of Blockchain?

“Blockchain technology” – in case you haven’t noticed, it’s been buzzing all over the financial news recently. The term still hasn’t quite exploded into the public square like Bitcoin, the notorious first implementation of the technology, but a quick Google Trends search since the first Bitcoin whitepaper was published in 2008 clearly shows an outburst of interest in “blockchain” since the end of 2015.

So what is blockchain technology anyway?

A blockchain is a type of data structure that can be purposed to create a digital ledger of transactions that can be shared among a distributed network of computers. By using cryptography, each account on the network may access and manipulate the ledger securely, without the need for any central authority or middleman.

Why is blockchain important to me and my business?

As businesses continue to evolve into truly digital enterprises, industries are awash in a deluge of big data, and two common challenges across SMBs continue to mount:

1. As big data’s volume increases, the ability to process and manipulate that data quickly and efficiently becomes compromised.

2. The sheer size and breadth of data in all of its environments, lifecycles, and formats increases the challenge of keeping business data secure.

Blockchain technology addresses these challenges directly in two ways:

First, with blockchain technology, there is increased transparency, accurate tracking, and a permanent, secure ledger. Once a block of data is recorded on the blockchain ledger, it’s extremely difficult to change or remove. When someone wants to add to it, participants in the network — all of which have copies of the existing blockchain — run algorithms to evaluate and verify the proposed transaction.

Second, because blockchain technology obviates the need for a central authority, middleman, or clearing house, transactions can be run and approved automatically in seconds or minutes. This reduces costs and boosts efficiency.

Blockchain will make the financial services industry’s infrastructure much less expensive. The blockchain protocol allows the instant transfer of value, irrespective of size. Faster, cheaper settlements could trim billions of dollars from transaction costs while improving transparency.

But blockchain technology’s application isn’t limited to the financial industry – its uses are endless:

Automotive: Consumers can use the blockchain to manage fractional ownership in autonomous cars.

Voting: Using a blockchain code, constituents can cast votes via smartphone, tablet or computer, resulting in immediately verifiable results.

Healthcare: Patients’ encrypted health information can be shared with multiple providers without the risk of privacy breaches.

Source: Financial Services Technology 2020 and Beyond: Embracing disruption (PwC)

Meanwhile, we are beginning to see a number of developers building APIs on the blockchain protocol, across a variety of applications:

– APIs to allocate digital resources such as energy, bandwidth, storage, and computation to the connected devices and services that need them, e.g., Filecoin.

– APIs for Oculus Rift: With access to the virtual world now becoming TRON-esque, developers are looking at creating APIs that can be used in the virtual space to make transactions, blurring the lines between virtual and real economies.

– Micropayment APIs tailored to the type of transaction being undertaken, e.g., tipping a blog versus tipping a car-share driver. Very useful in a shared economy where consumers increasingly become prosumers.

Source: WIRED

Blockchain: The time is now for progressive businesses.

As I mentioned earlier, blockchain is still on the cusp of public consciousness. In a recent survey by PwC, 56% of respondents recognize the importance of blockchain technology, but 57% say they are unsure about or unlikely to respond to this trend. I wrote in an earlier blog post that progressive SMBs are making huge gains, and the race is already on. According to PwC’s 19th Annual Global CEO Survey, 81% of banking CEOs are concerned about the speed of technological change, more than any other industry sector.

The Wall Street Journal recently reported that in the last year, more than 40 financial institutions said they were working with blockchain. Other sources detail that in 2015, 13 blockchain companies obtained over $365 million in funding, and by the beginning of this year, blockchain companies had raised well over a billion dollars to fund their development and operations.

SMBs should realize that blockchain technology is not just for global enterprises and multinationals. It is applicable to any sized business and can be scaled accordingly. The technology has reached a stage where some businesses are even experimenting with establishing smaller, “private blockchains” within their own offices, or are exploring how they can deploy their own blockchain on smaller “permissioned” networks.

The time is now for research and analysis – even professional consultation on the full extent of its application in your business. Certainly it’s a complex technology, with several yet-to-be determined regulatory implications, and as always, there are the usual difficulties with implementation, and sorting through the swath of competing vendors and platforms. But with a clear strategy of where, why, and how to apply the technology, you will be on the right road to incorporating blockchain into the framework of your business.

By Michael Davison