Transform & Innovate Legacy Technology

Resources ($) spent treading water

Transforming legacy technology remains a difficult proposition. A lack of historical expertise among current stakeholders is one impediment; another is the lack of transparency around soft costs within the technology budget. The full expense of legacy infrastructure can remain hidden until a significant technology outage, or a sudden spike in maintenance costs, forces the issue. Maintaining legacy systems and applications consumes roughly 30% of an organization’s IT budget on average, spanning:

  • Maintenance
  • Talent (Human capital)
  • Internal and external compliance (e.g. GDPR)
  • Risk: security threats and trends
  • Agility, scalability and stability

One of the most important factors in dealing with transforming legacy technology is facing the reality of your organization’s culture and alignment.

Where to start…

Begin by drawing a line around your monolithic systems, code and infrastructure. Companies often believe they must reengineer all their legacy systems from the ground up. This is a common error; instead, it is critical to delineate which portions of the system can be insulated and identified as the “core”, and then build APIs or other structures around that core.

Develop an integration strategy and then construct an integration layer. This means some code will be written at the foundational or infrastructure level, some at the database layer, and some in the user-experience environment. It is critical to identify those systems which can be detethered and then “frozen”. This enables a phased integration approach, upon which additional functionality can be layered. Depending on the complexity of the legacy architecture, wholesale changes may be cost prohibitive, so the ability to isolate, freeze and use a layered-build approach is an appropriate solution. It permits an organization to stabilize its application code and then build APIs or other integration layers around the “frozen” areas of the technology stack. In some circumstances, blockchain can provide a fast and simple way to put an integration layer in place within the legacy or “frozen” environments.
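As a rough illustration of the freeze-and-wrap pattern, here is a minimal Python sketch of a facade that translates a frozen legacy interface into a modern, structured one. `LegacyBillingSystem` and its fixed-width record format are hypothetical stand-ins for whatever core you have chosen to freeze.

```python
class LegacyBillingSystem:
    """Stand-in for the frozen core; its interface never changes."""
    def fetch_record(self, account_id: str) -> str:
        # Legacy systems often return fixed-width records like this one.
        return f"{account_id:<10}{'ACTIVE':<8}{1234.56:>10.2f}"

class BillingFacade:
    """New code talks to this adapter, never to the legacy core directly."""
    def __init__(self, core: LegacyBillingSystem):
        self._core = core

    def get_account(self, account_id: str) -> dict:
        raw = self._core.fetch_record(account_id)
        # Translate the frozen format into a modern, structured payload.
        return {
            "account_id": raw[0:10].strip(),
            "status": raw[10:18].strip(),
            "balance": float(raw[18:28]),
        }

facade = BillingFacade(LegacyBillingSystem())
print(facade.get_account("A-1001"))
```

Because only the facade knows the frozen record layout, new functionality can be layered on top without ever touching the legacy code itself.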

Missing Link

The most important component of transformation and innovation is the people within the organization, not the technology or the skillsets around the technology. Industry studies indicate a potential 20-30% increase in productivity and creative thought when individuals are engaged and aligned with the organization’s goals, and when the change processes align with individual goals and performance. All departments and stakeholders must be in alignment, from product, QA, development, and infrastructure to the end users. This is the most important aspect of any technology transformation initiative: creating a safe and collaborative environment that facilitates “creative dissent”.

Can Artificial Intelligence Catalyze Creativity?

In the 2017 “cerebral” Olympic games, artificial intelligence defeated the human brain in several key categories. Google’s AlphaGo beat the best player of Go, humankind’s most complicated strategy game; algorithms taught themselves how to predict heart attacks better than the AHA (American Heart Association); and Libratus, an AI built by Carnegie Mellon University, beat four top poker players at no-limit Texas Hold ‘Em. Many technologists agree that computers will eventually outperform humans on step-by-step tasks, but when it comes to creativity and innovation, humans will always be a part of the equation.

Inspiration, from the Latin inspiratus, literally means “breathed into.” It implies a divine gift – the aha moment, the lightning bolt, the secret sauce that can’t be replicated. Around the globe, large organizations are attempting to reshape their cultures to foster innovation and flexibility, two core competencies needed to survive the rapid-fire rate of change. Tom Agan’s HBR article “The Secret to Lean Innovation” identified learning as the key ingredient, while Lisa Levey believes that seeing failure as a part of success is key.

At the same time, although innovation is a human creation, machines do play a role in that process. Business leaders are using AI and advanced business intelligence tools to make operations more efficient and generate higher ROI, but are they designing their digital ecosystems to nurture a culture of innovation? If the medium is the message, then they should be.

“If you want to unlock opportunities before your competitors, challenging the status quo needs to be the norm, not the outlier. It will be a long time if ever before AI replaces human creativity, but business intelligence tools can support discovery, collaboration and execution of new ideas.” – Joe Sticca, COO at Synaptik

So, how can technology augment your innovation ecosystem?

Stop

New business intelligence tools can help you manage innovation, from sourcing ideas to generating momentum and tracking return on investment. For instance, to prevent corporate tunnel vision, you can embed online notifications that superimpose disruptive questions on a person’s screen. With this simple tool, managers can help employees step outside the daily grind to reflect on the larger questions and how they impact today’s deliverables.

Collaborate

The market is flooded with collaboration tools that encourage employees to leverage each other’s strengths to produce higher quality deliverables. The most successful collaboration tools are those that seamlessly fit into current workflows and prioritize interoperability. To maximize innovation capacity, companies can use collaboration platforms to bring more diversity to the table by inviting external voices including clients, academics and contractors into the process.

Listen

Social listening tools and sentiment analysis can provide deep insights into the target customer’s needs, desires and emotional states. When inspiration strikes, innovative companies are able to prototype ideas quickly and share those ideas with the digital universe to understand what sticks and what stinks. By streamlining A/B testing and failing fast and often, agile companies can reduce risk and regularly test their ideas in the marketplace.
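The fail-fast A/B loop described above boils down to a standard statistical comparison. As a sketch, the following Python snippet compares two variants' conversion rates with a two-proportion z-test; the counts are invented for illustration.

```python
from math import sqrt, erf

def ab_test(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: did variant B convert better than A?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

z, p = ab_test(conv_a=120, n_a=2400, conv_b=165, n_b=2400)
print(f"z = {z:.2f}, p = {p:.4f}")  # a small p means B's lift is unlikely to be noise
```

A small p-value is the signal to keep the winning variant and move on; a large one is the signal to fail fast and test the next idea.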

While computers may never birth the aha moments that drive innovation, advanced business intelligence tools and AI applications can capture sparks of inspiration and lubricate the creative process. Forward-thinking executives are trying to understand how AI and advanced business intelligence tools can improve customer service, generate higher ROI, and lower production costs. Companies like Cogito are using AI to provide real-time behavioral guidance to help customer service professionals improve the quality of their interactions while Alexa is using NLP to snag the full-time executive assistant job in households all over the world.

Creativity is the final frontier for artificial intelligence. But rather than AI competing against our innovative powers, business intelligence tools like Synaptik can bolster innovation performance today. The Synaptik difference is an easy user interface that makes complex data management, analytics and machine learning capabilities accessible to traditional business users. We offer customized packages that are tailored to your needs and promise to spur new ideas and deep insights.

By Nina Robbins

Sparking Digital Communities: Broadcast Television’s Answer to Netflix

In the late 1990s and early 2000s, network television dominated household entertainment. In 1998, nearly 30% of the population of the United States tuned into the NBC series finale of “Seinfeld”. Six years later, NBC’s series finale of the popular sitcom “Friends” drew 65.9 million people to their television screens, making it the most watched episode on US network TV in the early aughts. Today, nearly 40% of the viewers who tuned into the “Game of Thrones” premiere watched the popular show via same-day streaming services and DVR playback. The way people watch video content is changing rapidly, and established network television companies need to evolve to maintain their viewership.

While linear TV is still the dominant platform amongst non-millennials, streaming services are quickly catching up. As young industry players like Hulu, Netflix and YouTube transform from streaming services into content creators and more consumers cut ties with cable, established network broadcasters need to engage their loyal audiences in new ways. The challenge to stay relevant is further exacerbated by market fragmentation, as consumer expectations for quality content with fewer ad breaks steadily rise.


Courtesy of Visual Capitalist

One advantage broadcast television still has over streaming services is the ability to tap into a network of viewers watching the same content at the same time. In 2016, over 24 million unique users sent more than 800 million TV-related tweets. To stay relevant, network television companies are hoping to build on this activity by making the passive viewing experience an active one. We spoke with Michelle Imbrogno, Advertising Sales Director at This Old House, about the best ways to engage the 21st-century audience.

“Consumers now get their media wherever and whenever it’s convenient for them. At “This Old House”, we are able to offer the opportunity to watch our Emmy Award winning shows on PBS, on thisoldhouse.com or youtube.com anytime. For example, each season we feature 1-2 houses and their renovations. The editors of the magazine, website and executive producer of the TV show work closely together to ensure that our fans can see the renovations on any platforms. We also will pin the homes and the items in them on our Pinterest page. Social media, especially Facebook, resonates well with our readers.” – Michelle Imbrogno, Advertising Sales Director, This Old House

Social media platforms have become powerful engagement tools. According to Nielsen’s Social Content Ratings in 2015, 60% of consumers are “second screeners”, using their smartphones or tablets while watching TV. Many second screeners use their devices to comment and interact with a digital community of fans. Games, quizzes and digital Q&As can keep viewers engaged with their favorite programming on a variety of platforms. The NFL is experimenting with new engagement strategies and teamed up with Twitter in 2016 to livestream games and activate the digital conversation.

“There is a massive amount of NFL-related conversation happening on Twitter during our games and tapping into that audience, in addition to our viewers on broadcast and cable, will ensure Thursday Night Football is seen on an unprecedented number of platforms.” – NFL Commissioner Roger Goodell

With social media optimization (SMO) software, television networks can better understand their audience and adjust their social media strategy quickly. Tracking website traffic and click rates simply isn’t enough these days. To stay on trend, companies need to start tracking new engagement indicators using Synaptik’s social media intelligence checklist:

Step 1: Integrate Social Listening Tools

The key to understanding your audience is listening to what they have to say. By tracking mentions, hashtags and shares you can get a better sense of trending topics and conversations in your target audience. Moreover, this knowledge can underpin your argument for higher price points in negotiations with media buyers and brands.
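The mention-and-hashtag tracking in this step can be sketched in a few lines of Python; the sample posts below are fabricated for illustration.

```python
import re
from collections import Counter

posts = [
    "Loved tonight's episode! #ThisOldHouse #renovation",
    "That kitchen reveal on #ThisOldHouse was incredible",
    "Anyone else watching #renovation shows all weekend?",
]

# Extract hashtags, normalize case, and tally them to surface trending topics.
hashtags = Counter(
    tag.lower() for post in posts for tag in re.findall(r"#(\w+)", post)
)
print(sorted(hashtags.items()))  # [('renovation', 2), ('thisoldhouse', 2)]
```

The same counting pattern extends to brand mentions or shares; the resulting frequencies are exactly the evidence that supports a higher-price-point argument with media buyers.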

Step 2: Conduct a Sentiment Analysis

Deciphering a consumer’s emotional response to an advertisement, character or song can be tricky, but sentiment analysis digs deeper, using natural language processing to understand consumer attitudes and opinions quickly. Additionally, you can customize outreach to advertisers based on the emotional responses they are trying to tap into.
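As a toy illustration of the idea, here is a minimal lexicon-based sentiment scorer; production sentiment analysis would rely on a trained NLP model, and this tiny word list is purely illustrative.

```python
POSITIVE = {"love", "great", "amazing", "fun"}
NEGATIVE = {"hate", "boring", "awful", "slow"}

def sentiment(text: str) -> str:
    """Score a post as positive/negative/neutral by counting lexicon hits."""
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

print(sentiment("I love this amazing show"))  # positive
print(sentiment("what a boring finale"))      # negative
```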

Step 3: Personality Segmentation

Understanding a consumer’s personality is key to messaging. If you want to get through to your audience you need to understand how to approach them. New social media tools like Crystal, a Gmail plug-in, can tell you the best way to communicate with a prospect or customer based on their unique personality. This tool can also help you customize your approach to media buyers and agents.

By creating more accessible content for users and building a digital community around that content, television networks can expect to increase advertising revenue and grow their fan bases. With Synaptik’s social listening tools, companies can track conversations around specific phrases, words, or brands. Sign up for a 30-minute consultation and we can show you what customers are saying about your products and services across multiple social media channels (Facebook, Twitter, LinkedIn, etc.).

Contributors:

Joe Sticca, Chief Operating Officer at True Interaction

Kiran Prakash, Content Marketing at True Interaction

by Nina Robbins

Big Data – The Hot Commodity on Wall Street

Imagine – a fluorescent stock ticker tape speeding through your company’s stats: a 20% increase in likes, a 15% decrease in retail foot traffic and 600 retweets. In the new economy, net worth alone doesn’t determine the value of an individual or a business. Social sentiment, central bank communications, retail sentiment, technical factors, foot traffic and event-based signals all contribute to the atmospheric influence encasing your company’s revenue.

NASDAQ recently announced the launch of the “Nasdaq Analytics Hub”, a new platform that provides the buy side with investment signals that are derived from structured and unstructured data and unique to Nasdaq. Big Data is the new oil, and Wall Street is starting to transform our crude data material into a very valuable commodity.

What does this mean for the future of business intelligence?

It means that businesses that have been holding on to traditional analytics as the backbone of boardroom decisions must evolve. Nasdaq has pushed big data BI tech squarely into the mainstream. Now, it’s survival of the bittest.

An early majority of businesses have already jumped onto the Big Data bandwagon, but transformation hasn’t been easy. According to Thoughtworks, businesses are suffering from “transformation fatigue – the sinking feeling that the new change program presented by management will result in as little change as the one that failed in the previous fiscal year.” Many companies are in a vicious cycle of adopting a sexy new data analytics tool, investing an exorbitant amount of time in data prep, forcing employees to endure a cumbersome onboarding process, getting overwhelmed by the complexity of the tool, and finally, giving up and reverting to spreadsheets.


“There is a gap and struggle with business operations between spreadsheets, enterprise applications and traditional BI tools that leave people exhausted and overwhelmed, never mind the opportunities with incorporating alternative data to enhance your business intelligence processes.”
– Joe Sticca, COO, TrueInteraction.com – Synaptik.co

Now, the challenge for data management platforms is to democratize data science and provide self-service capabilities to the masses. Luckily, data management platforms are hitting the mark. In April, Harvard Business Review published results of an ongoing survey of Fortune 1000 companies about their data investments since 2012, “and for the first time a near majority – 48.4% – report that their firms are achieving measurable results for their big data investments, with 80.7% of executives characterizing their big data investments as successful.”

As alternative data like foot traffic and social sentiment become entrenched in the valuation process, companies will have to keep pace with NASDAQ and other industry titans on insights, trends and forecasting. Synaptik is helping lead the charge on self-service data analytics. Management will no longer depend on IT teams to translate data into knowledge.

“Now, with the progression of cloud computing and easy-to-use data management interfaces in tools like Synaptik, you’re able to bring enterprise control to your data analytics processes and scale into new data science revenue opportunities.” – Joe Sticca, COO, TrueInteraction.com – Synaptik.co

Synaptik’s fully managed infrastructure makes big data in the cloud fast, auto-scalable, secure and on-demand when you need it. With auto-ingestion data-transfer agents and web-based interfaces similar to spreadsheets, you can parse and calculate new metadata to increase dimensionality and insights using server-side computing, which is a challenge for user-side spreadsheet tools.
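The spreadsheet-style derived-column idea can be sketched as follows; the rows and the conversion-rate metric are hypothetical examples, not Synaptik's actual interface.

```python
# Raw ingested rows, as a spreadsheet user would see them.
rows = [
    {"store": "NYC", "visits": 1200, "sales": 300},
    {"store": "LA",  "visits": 800,  "sales": 160},
]

# Server-side, derive a new "metadata" column the way a spreadsheet user
# would add a calculated column: conversion rate of visits to sales.
for row in rows:
    row["conversion_rate"] = round(row["sales"] / row["visits"], 3)

print(rows[0]["conversion_rate"])  # 0.25
```

The derived column becomes a new dimension for reporting and analysis, computed once on the server rather than in every user's local spreadsheet copy.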

By Nina Robbins

Securing The Future Of ROI With Simulation Decision Support

EDITOR’S NOTE: This article is about how to approach and think about Decision Simulation. True Interaction built SYNAPTIK, our Data Management, Analytics, and Data Science Simulation Platform, specifically to make it easy to collect and manage core and alternative data for more meaningful data discovery. For more information or a demo, please visit us at https://synaptik.co/ or email us at hello@www.true.design

EXCERPT

Simulation is simply the idea of imitating human or other environmental behaviors to test possible outcomes. A business will obviously want to take advantage of such simulation technologies to maximize profits, reduce risks and/or reduce costs.

Simulation decision support is the backbone of many cutting edge companies these days. Such simulations are used to predict financial climates, marketing trends, purchasing behavior and other tidbits using historical and current market and environmental data.

Managing ROI

Data management is a daunting task that should not be entrusted to loose and unruly processes and technology platforms. Maximizing profit and/or reducing risk using simulated information will not be an automatic process but rather a managed task. Your business resources should be leveraged for each project needing long-term ROI planning; computer simulations are just some pieces of the overall puzzle. Simulation decision support companies and platforms are not exactly a dime a dozen, but they should still be evaluated thoroughly before engaging.

Scaling Your Business

Modern software platforms exist to assist the growth of your business initiatives. Algorithms have been refined thanks to years of market data and simulations, giving a clearer picture of your expectations and theories. Machine learning has also improved rapidly over the past several years, making market simulations even more accurate for short- and long-term growth. There is no lack of algorithms or libraries of data science modules; the challenge is the ability to easily scale your core and alternative data sets into an easy-to-use platform configured to your business environment. Over the last several years, data science platforms such as Synaptik.co have allowed companies with limited resources to scale their operations and take advantage of decision simulation processes that were once too expensive and required specialized, separate resources to manage.

Non-tech Based Departments Can No Longer Hide

All branches of companies are now so immersed in software and data that it is difficult to distinguish the IT from the non-IT departments. Employees plug away at their company-designated computing resources to keep records for the greater good of the corporation. These various data pools and processes are rich in opportunities to enable accurate business simulations. In turn, simulation findings can be shared with different departments and partners to enrich a collaborative environment and amplify knowledge, for a greater propensity for success. Big and small companies alike will need simulation decision support processes to ensure they not only stay competitive but excel in their growth initiatives.

Data and Knowledge Never Sleeps

In 2016, the Domo research group published data visualizing the extent of the data output by the world. By 2020, the group predicts, we will have a data capacity of over 44 trillion gigabytes. This overwhelming amount of data has companies on their toes trying to grasp the wild change in our modern world. The data produced is neutral to the truth, meaning both accurate and inaccurate ideas are influencing the minds of your customers, partners and stakeholders. Scaling profits and reducing risk will become an increasingly involved activity, which gives us another reason to embark on decision simulation processes to deal with the overwhelming volume of data and decisions in this fluid, data-rich world.


By Joe Sticca

Shocking? Predictive Analytics Might Be Right For Your Future

EDITOR’S NOTE: This article is about how to approach and think about Predictive Analytics. True Interaction built SYNAPTIK, our Data Management, Analytics, and Data Science Simulation Platform, specifically to make it easy to collect and manage core and alternative data for more meaningful data discovery. For more information or a demo, please visit us at https://synaptik.co/ or email us at hello@www.true.design

EXCERPT

“What is marketing?” Isn’t it the attempt to sell products and services to the people most likely to buy them? Would you be shocked to learn that predictive analytics is useful for completing sales? We tend to think of our processes, departments and data in siloed terms. Yet with today’s platforms it is critical to harness insights across silos as well as bring in “alternative data”.

How is your data management? Can your sales and marketing staff use your data sets to up-sell products or services? Data management is the biggest barrier, as well as the biggest opportunity, to surpassing internal KPIs.

Know Your Customer.

Have you ever heard someone lamenting the things they should have done in their youth to be successful adults? They might read a good book and suggest they “could have written that themselves.” They think the path to success is obvious: simply know everything about your customer and provide him or her with valuable products or services. That is the secret to success. But how do you get to know your customer? The answer is data management and predictive analytics.

What Do You Know?

Customer Relationship Management (CRM) software has become very popular because it allows you to accumulate, manage and act upon client data. This can be an automated data management system. You could even review past buying habits and automatically send an email for a hot new product that might be appealing. Up-selling can increase your profits per customer. CRM is business analytics, giving you a deeper understanding of who your customer is, what he wants and where he is going. Why do you think so many websites want to place cookies on your computer? They want to track your behavior and anticipate your next buying action.
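A minimal sketch of the up-sell idea, assuming a fabricated order history: recommend the product most often co-purchased with something the customer already bought.

```python
from collections import Counter
from itertools import combinations

# Hypothetical past orders pulled from a CRM; each order is a set of products.
orders = [
    {"drill", "drill bits"},
    {"drill", "drill bits", "safety glasses"},
    {"paint", "brushes"},
]

# Count how often each pair of products appears in the same order.
co_purchases = Counter()
for order in orders:
    for a, b in combinations(sorted(order), 2):
        co_purchases[(a, b)] += 1
        co_purchases[(b, a)] += 1

def upsell(product: str) -> str:
    """Suggest the product most frequently bought alongside `product`."""
    pairs = {b: n for (a, b), n in co_purchases.items() if a == product}
    return max(pairs, key=pairs.get)

print(upsell("drill"))  # drill bits
```

Real CRM platforms use far richer models, but the principle is the same: past buying habits become the signal for the next automated offer.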

When Did You Know It?

If you don’t know what your customers bought yesterday, how will you know what they will buy tomorrow? The most agile businesses understand their customers in real time. The Twitter world is about immediate gratification: people want to say “Hi,” see your pictures and plan your future together within the first 3 seconds you meet. The profitable business knows the answers before the customer asks. Predictive analytics might be right for your future because it gives you the power to anticipate consumer buying trends and behaviors across channels (social, video, mobile, etc.). Your competitor might already be using these business analytics; you might be leaving “money on the table.” Sign up for a discussion, demo or strategy session today: Hello@TrueInteraction.com.


How Alternative Data Can Transform Your Business Intelligence

EDITOR’S NOTE: This article is about harnessing new sources of Alternative Data. True Interaction built SYNAPTIK, our Data Management, Analytics, and Machine Learning Platform, specifically to make it easy to collect and manage core and alternative data/media types for more meaningful data discovery. For more information or a demo, please visit us at https://synaptik.co/ or email us at hello@www.true.design

Big data has been commonly described over the last few years through properties known as the “3 V’s”: Volume, Velocity, and Variety. If you are a human being just about anywhere in the world today, it’s patently obvious to you that these three dimensions are increasing at an exponential rate.

We’ve seen the staggering statistics with regards to Volume and Velocity reported and discussed everywhere:

Big Volume
  • IDC reported that the data we collectively create and copy globally is doubling in size every two years. Calculated at 4.4 zettabytes in 2014, the organization estimates global data will reach 44 zettabytes — that’s 44 trillion gigabytes — by 2020.
  • Cisco forecasts that overall mobile data traffic will grow to 49 exabytes per month by 2021, a seven-fold increase over 2016, corresponding to a compound annual growth rate (CAGR) of 47 percent from 2016 to 2021.
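Cisco's growth-rate figure can be sanity-checked with the standard CAGR formula: a seven-fold increase over the five years from 2016 to 2021.

```python
# CAGR = (end / start) ** (1 / years) - 1; here end/start is the 7x multiple.
growth_multiple = 7
years = 2021 - 2016

cagr = growth_multiple ** (1 / years) - 1
print(f"{cagr:.1%}")  # 47.6%, consistent with the ~47 percent Cisco quotes
```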

Big Velocity

Facebook’s 1.97 billion monthly active users send an average of 31.25 million messages and view 2.77 million videos every minute.

Twitter’s 308 million monthly active users send, on average, around 6,000 tweets every second. This corresponds to over 350,000 tweets sent per minute, 500 million tweets per day and around 200 billion tweets per year.

Big Variety = Alternative, Non-traditional, Orthogonal Data

These well-touted figures often leave one feeling aghast, small, and perhaps powerless. Don’t worry, the feeling is mutual! So today, let’s get ourselves right-sized again, and shift our focus to the 3rd dimension — of big data, that is — and examine a growing, more optimistic, and actionable business trend concerning big data that is materializing in organizations and businesses of all kinds, across just about any industry that you can imagine, without regard for business size or scope. Let’s examine the explosion of big data Variety, specifically with regards to harnessing new and emerging varieties of data to further inform reporting, forecasting, and the provision of actionable BI insights.

In a pattern similar to online retail’s “Long Tail” — the emphasis on niche products for consumers that emerged in the 2000s — more and more future-leaning businesses are incorporating outside, alternative “niches” of data that differ from the traditional BI data sources that standard BI dashboards have commonly provided.

In a recent interview in CIO, Krishna Nathan, CIO of S&P Global explained that “Some companies are starting to collect data known as alternative, non-traditional or orthogonal.” Nathan further describes Alternative Data as the various data “that draw from non-traditional data sources, so that when you apply analytics to the data, they yield additional insights that complement the information you receive from traditional sources.” Because of the increasing prevalence of data from mobile devices, satellites, IoT sensors and applications, huge quantities of structured, semi-structured and unstructured data have the potential to be mined for information and potentially help people make better data-driven decisions. “While it is still early days for this new kind of data”, Nathan says, “CIOs should start to become familiar with the technologies now. Soon enough, alternative data will be table stakes.”

In The Field

Let’s examine the various applications of these new data sources that are manifesting themselves in parallel with the burgeoning technological advancements in our world today.

VC and Credit

Alternative data is increasingly wielded by VC firms as well as the credit industry to lend insight into backing startups, businesses, and technologies. Many small businesses, especially those with a limited credit history, have difficulty demonstrating creditworthiness and may be deemed as high risk when viewed through the lens of traditional data sources.

However, Experian recently described the growing number of online marketplace lenders, or nonbank lenders, that have already begun taking a nontraditional approach by leveraging a wealth of alternative data sources, such as social media, Web traffic, or app downloads, to help fill the void left by a business’s limited credit history. By combining both traditional and nontraditional data sets, these lenders are able to help small businesses access financial resources while expanding their own portfolios.

Health

Patient information continues to be collected through traditional public health data sources, including hospital administration departments, health surveys and clinical trials. Data analysis of these sources is slow, costly, limited by responder bias, and fragmented.

However, according to MaRS DD, a research and science-based entrepreneurial venture firm, with the growing use of personal health applications among the public, self-reported information on prescription drug consumption and nutritional intake can be analyzed and leveraged to gain insight into patient compliance and use patterns, as well as into chronic disease management aptitude between visits to frontline healthcare practitioners. In addition, social media platforms can be used both as monitoring tools and as non-traditional methods of increasing patient engagement, as they allow healthcare professionals to interact with populations that under-utilize services. Healthcare organizations can mine social media for specific keywords to focus initiatives that track the spread of influenza, Zika, or opioid addiction, for example, or even to provide real-time intervention.

Retail, Dining, Hospitality and Events

Several different kinds of data sources can give these industries a bigger picture and aid in both more granular reporting and more accurate forecasting. For example, Foursquare famously predicted that Chipotle same-store sales would fall 29 percent after the Mexican chain was hit with E. coli outbreaks, based on check-ins in its application. The actual decline announced by Chipotle ended up being a spot-on 30 percent. It’s no coincidence that Foursquare recently announced Foursquare Analytics, a foot-traffic dashboard for brands and retailers.

In addition, by making use of CCTV or drone imagery, incredible insight can be garnered from examining in-store foot traffic or the density of vehicles in a retailer’s parking lot over time. Today, a combination of Wi-Fi hotspots and CCTV cameras can compile numbers about in-store customer traffic patterns in the same way that online stores collect visitor and click information. For example, by using a modern CCTV system to count the number of people in each part of the store, heatmap analytics can visualize “hot zones” to help maximize in-store promotional campaigns, and identify “cold zones” to determine how store layout changes can improve customer traffic flow.
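The hot-zone/cold-zone analysis can be sketched simply: given per-zone visitor counts (invented here for illustration), flag zones well above or below the store average.

```python
# Per-zone visitor counts compiled from CCTV people-counting or Wi-Fi pings.
zone_counts = {"entrance": 540, "electronics": 410, "apparel": 260, "garden": 90}

avg = sum(zone_counts.values()) / len(zone_counts)
# Flag zones more than 20% above average as "hot", 20% below as "cold".
hot = [z for z, n in zone_counts.items() if n > 1.2 * avg]
cold = [z for z, n in zone_counts.items() if n < 0.8 * avg]

print("hot zones:", hot)    # ['entrance', 'electronics']
print("cold zones:", cold)  # ['garden']
```

The 20% thresholds are arbitrary illustrative cutoffs; a real deployment would tune them to the store and normalize for zone size.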

Don’t forget the weather! By leveraging a real-time weather data analytics system in order to process historical, current, and forecasted weather data, retailers can predict how shifting demands will affect inventory, merchandising, marketing, staffing, logistics, and more.

Wall Street

You can bet your life that investment firms are early adopters of alternative data sources such as the Chipotle-Foursquare story mentioned earlier. Consider the incredible resource that satellite imagery is becoming — it’s not just for government intelligence anymore: Because satellite imagery now enables organizations to count cars in retailers’ parking lots, it is possible to estimate quarterly earnings ahead of a business’ quarterly reports. Data analysts can use simple trigonometry to measure the shadows cast by floating oil tank lids in order to gauge the world’s oil supply. By monitoring vehicles coming and going from industrial facilities in China, it’s even possible to create a nascent China manufacturing index. Infrared sensors combined with satellite images can detect crop health far ahead of the USDA. All of this is a boon for traders and investors.
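The shadow-trigonometry trick can be sketched as follows. This is a deliberately simplified model with illustrative numbers, not a production pipeline: floating-roof tanks cast a crescent shadow onto the lid that deepens as the tank empties, so lid depth is roughly shadow length times the tangent of the sun's elevation angle:

```python
import math

def tank_fill_fraction(interior_shadow_m, tank_height_m, sun_elevation_deg):
    """Estimate how full a floating-roof tank is from overhead imagery.

    The crescent shadow the tank wall casts onto the floating lid grows
    as the lid sinks: lid_depth = shadow_length * tan(sun_elevation).
    All numbers here are illustrative.
    """
    lid_depth = interior_shadow_m * math.tan(math.radians(sun_elevation_deg))
    lid_depth = min(lid_depth, tank_height_m)  # clamp to the physical range
    return 1.0 - lid_depth / tank_height_m

# A 20 m tall tank whose interior shadow is 6 m long at 45° sun elevation:
print(round(tank_fill_fraction(6.0, 20.0, 45.0), 2))  # 0.7
```

Aggregated across thousands of tanks in imagery taken at known sun angles, estimates like this can be rolled up into a supply index.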

What About Your Organization?

No matter the size of your business, now is the time to consider upgrading your 2000s-era BI dashboards to incorporate alternative data sources — remember, the convergence of IoT, cloud, and big data is creating new opportunities for analytics all the time. Data is expected to double every two years for the next decade. Furthermore, it is essential to integrate all of these data opportunities with traditional data sources in order to create a full spectrum of analytics and drive more intelligent, more actionable insights.

The Right Analytics Platform

Legacy data management systems that have not optimized their operations will not be able to process these new and disparate sources of alternative data to produce relevant information in a timely manner. The lack of machine learning mechanisms within these sub-optimal systems will hinder businesses in their knowledge discovery process, barring organizations from making data-driven decisions in real time.

According to Joe Sticca, Senior Executive of Digital Transformation & Data Science for True Interaction, “The most deleterious disadvantage of failing to address these pressing issues… is the careless neglect of invaluable business insight that is concealed in the mass of available data. Now, more than ever, businesses of all size need the ability to do great data discovery, but without necessitating a deep core technical development and data analyst skillset to do so.”

One solution path? A cutting-edge, fully-managed data and machine learning platform like Synaptik, which makes it easy to connect with dozens of structured and unstructured data services and sources, bringing the power of algorithms, statistical analysis, predictive modeling, and machine learning to a multitude of purposes and metrics, such as brand sentiment, campaign effectiveness, and customer experience. Synaptik helps businesses transform via an adaptive, intuitive and accessible platform, using a modern mix of lightweight frameworks, scalable cloud services, and effective data management and research tools. More importantly, it works with non-IT skill sets to propagate better pattern recognition across your organization’s people and divisions.

(infographic by Quandl.com)

by Michael Davison

3 Issues with Data Management Tools

The market is currently awash with BI tools that advertise lofty claims about their ability to leverage data to ensure ROI. It is evident, however, that these systems are not created equal, and implementing the wrong one could adversely affect an organization.

Cost

While the steady, multifold growth of the digital universe has pushed data storage costs down, reportedly by as much as 15-20 percent in the last few years alone, it is also the catalyst for the rising cost of data management. The cause for concern regarding data storage does not lie in the storage technologies themselves, but in the increasing complexity of managing data. The demand for people with adequate data management skills is not being sufficiently met, forcing organizations to train personnel from within. The effort required to equip an organization with the skills and knowledge to properly wield these new data management tools demands a considerable portion of a firm’s time and money.

Integration

The increased capacity of a new data management system can be hindered by the existing environment if the process of integration is not handled with the proper care and supervision. With the introduction of a different system into a company’s current technological environment, as well as external data pools (e.g. digital, social, mobile, devices), the issue of synergy between the old and new remains. CIO identifies this as a common oversight and advises organizations to remain cognizant of how data is going to be integrated from different sources and distributed across different platforms, as well as to closely observe how any new data management system operates with existing applications and other BI reporting tools, to maximize the insight extracted from the data.

Evan Levy, VP of Data Management Programs at SAS, shares his thoughts on the ideal components of an efficient data management strategy as well as the critical role of integration within this process, asserting that:

“If you look at the single biggest obstacle in data integration, it’s dealing with all of the complexity of merging data from different systems… The only reasonable solution is the use of advanced algorithms that are specially designed to support the processing and matching of specific subject area details. That’s the secret sauce of MDM (Master Data Management).”

Reporting Focus

The massive, seemingly unwieldy volume of data is one major concern amid this rapid expansion; the other is that most of it is largely unstructured. Many data management tools offer to relieve companies of this issue by scrubbing the data clean and meticulously categorizing it. The tedious and expensive process of normalizing, structuring, and categorizing data does admittedly carry some informational benefit and can make reporting on the mass of data much more manageable. In the end, however, a lengthy, well-organized report does not guarantee usable business insight. According to research conducted by Gartner, 64% of business and technology decision-makers have difficulty getting answers simply from their dashboard metrics. Many data management systems operate mostly as visual reporting tools, lacking the knowledge discovery capabilities imperative to producing actionable intelligence for the organizations they serve.

The expenses that many of these data management processes pose for companies, and the difficulties of integrating them with existing applications, may prove fruitless if the systems cannot provide real business solutions. Hence, data should not be collected indiscriminately, nor its management conducted with little forethought. Before deciding on a Business Intelligence system, begin with a strategic business question to frame the data management process; that is the surest way to ensure the successful acquisition and application of big data, both structured and unstructured.

Joe Sticca, Chief Operating Officer of True Interaction, contributed to this post.

By Justin Barbaro

The Top 9 Things You Need to Know About BI Software

Learning more about the data your business collects is important to evaluating the decisions you make today, next year, and in the next decade; that’s called business intelligence. But while software can do that for you, figuring out what software you should use can be a perplexing process.

The first step to evaluating that software is to figure out the platform you’ll be using: both its workflow and its underlying technology. You’ll also need to establish your goals and objectives, and understand who needs to use that business intelligence. Any software like this will require training, both at purchase and in continued use. And your software should also provide solutions for security.

As you evaluate your software, you have to ask questions, and then more questions: How much support will you get? What features are on the roadmap?

Want to work through the options? Use the handy graphic below for steps and concerns.

by Michael Davison

6 Protips on Getting the Most out of Your Big Data

Here’s some interesting news: a recent Teradata study showed a strong correlation between a company’s tendency to rely on data when making decisions and its profitability and ability to innovate. According to the study, data-driven companies are more likely to generate higher profits than competitors who report a low reliance on data. Access to data, and to quantitative tools that convert numbers into insights, is two to three times more common in data-centric companies, and such companies are much more likely to reap the benefits of data initiatives, from increased information sharing, to greater collaboration, to better quality and speed of execution.

Today Big Data is a big boon for the IT industry; organizations that gather significant data are finding new ways to monetize it, while the companies that deliver the most creative and meaningful ways to display the results of Big Data analytics are lauded, coveted, and sought after. But for certain, Big Data is NOT some magic panacea that, when liberally applied to a business, creates giant fruiting trees of money.

First let’s take a look at some figures that illustrate how BIG “Big Data” is.

– A few hundred digital cameras together have enough combined memory to store the contents of every printed book in the Library of Congress.

– Just 10 minutes of the world’s email content equals, again, the contents of every printed book in the Library of Congress. That’s 144x the contents of the Library of Congress every day.

– Every day, we create 2.5 quintillion bytes of data — so much that 90% of the data in the world today has been created in the last two years alone.

– Only 3% of potentially useful data is tagged, and even less is analyzed.

– In 2010 there was 1 trillion gigabytes of data on the Internet; that figure is predicted to double every two years, reaching roughly 40 trillion gigabytes by the year 2020.

The sheer size of large datasets forces us to come up with new methods of analysis, and as more and more data is collected, more and more challenges and opportunities will arise.

With that in mind, let’s examine 6 things to keep in mind when considering Big Data.

1. Data analytics gives you AN answer, not THE answer.

In general, data analysis cannot make perfect predictions; instead, it can make predictions that are better than what you could usually manage without it. Also, unlike pure math, data analytics does not get rid of all the messiness of the dataset. There is always more than one answer. You can glean insights from any system that processes data and outputs an answer, but it’s not the only answer.

2. Data analytics involves YOUR intuition as a data analyst.

If your method is unsound, then the answer will be wrong. In fact, the full potential of quantitative analytics can be unlocked only when combined with sound business intuition. Mike Flowers, chief analytics officer for New York City under Mayor Bloomberg, explained the fallacy behind either-or thinking as such: “Intuition versus analytics is not a binary choice. I think expert intuition is the major missing component of all the chatter out there about analytics and being data driven.”

3. There is no single best tool or method to analyze data.

There are two general kinds of data, though not every analysis will necessarily include both, and as you might expect, they need to be analyzed differently.

Quantitative data refer to the information that is collected as, or can be translated into, numbers, which can then be displayed and analyzed mathematically. It can be processed using statistical methods such as calculating the mean or average number of times an event or behavior occurs over a unit of time.
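For instance, computing a mean event rate is a one-liner; the daily counts below are made up:

```python
# Hypothetical daily counts of a tracked event (e.g. support-desk calls):
daily_counts = [12, 9, 15, 11, 13, 10, 14]

def mean_rate(counts):
    """Average number of events per unit of time (here, per day)."""
    return sum(counts) / len(counts)

print(mean_rate(daily_counts))  # 12.0
```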

Because numbers are “hard data” and not subject to interpretation, these methods can give nearly definitive answers to different questions. Various kinds of quantitative analysis can indicate changes in a dependent variable related to frequency, duration, intensity, timeline, or level, for example. They allow you to compare those changes to one another, to changes in another variable, or to changes in another population. They might be able to tell you, at a particular degree of reliability, whether those changes are likely to have been caused by your intervention or program, or by another factor, known or unknown. And they can identify relationships among different variables, which may or may not mean that one causes another. (Source: http://ctb.ku.edu/en/table-of-contents/evaluate/evaluate-community-interventions/collect-analyze-data/main)

Qualitative data are items such as descriptions, anecdotes, opinions, quotes, and interpretations, and are generally either not reducible to numbers or considered more valuable or informative if left as narratives. Qualitative data can sometimes tell you things that quantitative data can’t, such as why certain methods are working or not working, whether part of what you’re doing conflicts with participants’ culture, or what participants see as important. It may also show you patterns, in behavior, physical or social environment, or other factors, that the numbers in your quantitative data don’t, and occasionally even identify variables that researchers weren’t aware of. There are several different methods that can be used when analyzing qualitative data:

Content Analysis: Start with some ideas about hypotheses or themes that might emerge, and look for them in the data you have collected.

Grounded Analysis: Similar to content analysis in that it uses similar coding techniques, but you do not start from a defined point. Instead, you allow the data to ‘speak for itself’, with themes emerging from the discussions and conversations.

Social Network Analysis: Examines the links between individuals as a way of understanding what motivates behavior.

Discourse Analysis: Analyzes not only the conversation itself, but also the social context in which it occurs, including previous conversations, power relationships, and the concept of individual identity.

Narrative Analysis: Looks at the way stories are told within an organization in order to better understand how people think and are organized within groups.

Conversation Analysis: Largely used in ethnographic research; assumes that conversations are governed by rules and patterns that remain the same whoever is talking, and that what is said can only be understood by looking at what happened both before and after.

Sometimes you may wish to use one single method, and sometimes several, whether all of one type or a mixture of quantitative and qualitative approaches. Remember to have a goal or a question you want to answer: once you know what you are trying to learn, you can often come up with a creative way to use the data. It is your research, and only you can decide which methods will suit both your questions and the data itself. Quick tip: make sure that the method you use is consistent with the philosophical view that underpins your research, and within the limits of the resources available to you.

4. You do not always have the data you need in the way that you need it.

In a 2014 Teradata study, 42% of respondents said they find access to data cumbersome and not user-friendly. You might have the data, but format is KEY: it might be rife with errors, incomplete, or composed of different datasets that have to be merged. When working with particularly large datasets, oftentimes the greatest time sink, and the biggest challenge, is getting the data into the form you need.
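A minimal sketch of that merge-and-clean work, using two small hypothetical exports with mismatched key formatting and a missing field:

```python
# Two hypothetical exports describing the same customers, with inconsistent
# key formatting and a missing field — typical of real-world merges.
crm_rows = [
    {"id": " 1001 ", "name": "Acme Corp", "region": "NE"},
    {"id": "1002", "name": "Globex", "region": "West"},
]
billing_rows = [
    {"customer_id": "1001", "balance": "250.00"},
    {"customer_id": "1003", "balance": "75.50"},
]

def normalize_id(raw):
    """Strip whitespace so keys from different systems actually match."""
    return raw.strip()

def merge(crm, billing):
    """Left-join billing onto CRM records, defaulting missing balances to 0."""
    balances = {normalize_id(r["customer_id"]): float(r["balance"]) for r in billing}
    merged = []
    for row in crm:
        cid = normalize_id(row["id"])
        merged.append({**row, "id": cid, "balance": balances.get(cid, 0.0)})
    return merged

for row in merge(crm_rows, billing_rows):
    print(row["id"], row["balance"])
```

Even in this toy case, two decisions had to be made (how to normalize keys, and what to do with unmatched records); at scale, those decisions are where most of the effort goes.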

5. Not all data is equally available.

Sure, some data may exist free and easy on the Web, but more often than not, the sheer volume, velocity, or variety prevents an easy grab. Furthermore, unless there is an existing API or a vendor makes it easily accessible by some other means, you will ultimately need to write a script or even complex code to get the data the way you want it.
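A hedged sketch of what such a script often looks like: a generic pager that drains some data source, with a stub standing in for the real API call (the paging scheme and page size are assumptions, not any particular vendor's API):

```python
def fetch_all(fetch_page, page_size=100):
    """Collect all records from a paginated source.

    `fetch_page(offset, limit)` stands in for whatever the real source
    offers — an HTTP API call, a database query, or a scraper.
    """
    records, offset = [], 0
    while True:
        page = fetch_page(offset, page_size)
        records.extend(page)
        if len(page) < page_size:  # a short page means we've hit the end
            break
        offset += page_size
    return records

# Stub source with 250 records to show the paging behavior:
DATA = list(range(250))
def stub_fetch(offset, limit):
    return DATA[offset:offset + limit]

print(len(fetch_all(stub_fetch)))  # 250
```

Real scripts layer retries, rate limiting, and authentication on top of this loop, which is exactly why "just grab the data" is rarely as simple as it sounds.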

6. While an insight or approach adds value, it may not add enough value.

Broken Links: Why analytics investments have yet to pay off, a report from the Economist Intelligence Unit (EIU) and global sales and marketing firm ZS, found that although 70% of business executives rated sales and marketing analytics as “very” or “extremely important”, just 2% are ready to say they have achieved “broad, positive impact.”

In 2013, The Wall Street Journal reported that 44% of information technology professionals said they had worked on big-data initiatives that got scrapped. A major reason for so many false starts is that data is being collected merely in the hope that it turns out to be useful once analyzed. This type of behavior is putting the cart before the horse, and can be disastrous to businesses – again, remember to have a goal or question you want to answer.

Not every new insight is worth the time or effort it takes to integrate into existing systems. And no insight is ever totally new; if every insight seems new, something is wrong.

Hopefully these tips will set you off in the right direction when you are considering incorporating additional datasets and their associated analytics platforms into your business process. Good luck!

By Michael Davison

Is Your IT Department Ready for the Digital Age?

If you are smiling to yourself about the title of this post, and the quaint term “Digital Age,” and how it’s 2016 already, and the “Digital Age” has been upon us for years now, you may want to stack your SMB up against a few eye-opening metrics regarding the state of technology in small-to-medium businesses today.

Big data is upon us, and available to every enterprise, including small businesses – but it’s what you do with it that counts. The International Data Corporation (IDC) forecasts a 44-fold increase in data volumes between 2009 and 2020. Despite this Cambrian explosion of data, SMBs still appear to be behind when it comes to their IT capability. IDC projects a 40% growth in global data per year vs. just 5% growth in global IT spending in the future; furthermore, the organization noted that a shocking 68% of companies do not have a stated Business Intelligence / Analytics Strategy, and 79% of SMBs still use manual integration such as manual Excel files, or custom code.

Companies are no longer suffering from a lack of data—they’re suffering from a lack of the right data. Business leaders need the right big data to effectively define the strategic direction of the enterprise. The current generation of software was designed for functionality, but the next generation must also be designed for analytics. ~ Accenture Business Technology Trends Report

Right now amongst progressive SMBs, the race is on to develop and realize a true digital business ecosphere. Progressive SMBs are 55% more likely to have fully integrated business applications. Where does your organization stand in this contest? IDC predicts that by 2017, the transfer of cloud, social and big data investments from IT to line-of-business budgets will require 60% of CIOs to focus the IT budget on business innovation and value. This metric jibes with the results from Accenture’s Technology Vision survey, polling more than 2,000 business and technology executives across nine countries and 10 industries. According to the survey, 62% of SMBs are currently investing in digital technologies, and 35 percent are comprehensively investing in digital as part of their overall business strategy. In a recent article, Laurie McCabe, Partner at tech industry research masterminds SMB Group, points out that these progressive SMBs are well positioned to tap into new customer requirements, improve customer engagement and experience, and enter new markets. “As progressive SMBs move forward,” she notes, “they will continue to outpace their peers and reshape the SMB market.”

How will the market be reshaped? Accenture notes that 81 percent of companies believe that “in the future, industry boundaries will dramatically blur as platforms reshape industries into interconnected ecosystems”. Progressive SMBs that continue to invest in IT capability will reap tremendous gains; those that bring up the rear will be behind by orders of magnitude. Furthermore, being progressive leads directly to revenue: For instance, according to SMB Group, 75% of the Progressive medium businesses (who increased technology spending) anticipated revenue gains in 2012, compared to just 17% of medium businesses that decreased IT spending.

Naturally, some SMBs don’t have the budget or staff to “flip the switch” and implement a bottom-up overhaul of their entire business process to create a fully integrated digital solution. But that is OK. By working with a capable digital business enterprise development vendor, SMBs can commence an incremental, but still integrated approach to business management solutions. Companies can begin, for example, with a financials module, and then continue to add integrated modules as required and when able, to manage other functions such as manufacturing, distribution, project accounting, or sales and marketing, at their own pace. The important part is to get moving, and to take the time to honestly assess how your organization is using technology today.

True Interaction produces custom, full-stack, end-to-end, secure and compliant technology solutions across web, desktop and mobile – integrating data sources from e-commerce, enterprise resource planning, customer service, document inventory management, spend, performance… whatever data your business requires. From legacy systems to open source, we can determine the most optimal means to achieve operational perfection, devising and implementing the right tech stack to fit your business. We routinely pull together disparate data sources, fuse together disconnected silos, and do exactly what it takes for organizations to operate with tight tolerances, making your business engine hum.

Are you ready for the Digital Age? For real, this time? Let’s go!

By Michael Davison