Categories
Insights

Three Digital Marketing Innovations for Higher Education

EDITOR’S NOTE: This article is about how higher education marketing teams can transform their current strategies using digital transformation. True Interaction built SYNAPTIK, our Data Management, Analytics, and Data Science Simulation Platform, specifically to make it easy to collect and manage data, including marketing data, for more meaningful insights. For more information or a demo, please visit us at https://synaptik.co/ or email us at hello@www.true.design.

Here Comes Fake Data!

EDITOR’S NOTE: This is a guest article authored by Geber Consulting; the original article was featured here. True Interaction built SYNAPTIK, our Data Management, Analytics, and Data Science Simulation Platform, specifically to make it easy to collect and manage core and alternative data for more meaningful data discovery. For more information or a demo, please visit us at https://synaptik.co/ or email us at hello@www.true.design.

Making Human Resources More Inclusive, Driven By Data

EDITOR’S NOTE: This article is about how to approach and think about leveraging data to make human resources more inclusive across organizations. True Interaction built SYNAPTIK, our Data Management, Analytics, and Data Science Simulation Platform, specifically to make it easy to aggregate siloed data (i.e. from Human Resources, Finance, etc.) for more meaningful data discovery.

We know that Machine Learning (ML) and Artificial Intelligence (AI) will transform the future of work itself, but how will it affect the processes by which organizations choose, develop and retain their workers?

Katherine Ullman is a data scientist at Paradigm, a strategy consulting firm that uses social science to make companies more inclusive. Paradigm has partnered with a range of clients, including Airbnb, Pinterest, Asana, and Slack. Katherine and I recently discussed how her organization works with data and machine learning to help clients better understand the impact of their people processes, including the recruitment, selection, training and retention of underrepresented groups.

(NOTE: The word “impute” below is a technical term that refers to the assigning of a value by inference.)

TI: Being a data scientist at a human resources and strategy consultancy that leverages social science to make companies more inclusive sounds like an amazing job. Can you provide some insight into your daily work and the work you do with clients?

“One of the core services I work on as a data scientist is our comprehensive Diversity & Inclusion Assessment. (The assessment) is a multi-month project designed to identify barriers and design client-specific strategies for diversity and inclusion. We do that by collecting and analyzing both quantitative and qualitative data about the client’s people processes and the outcomes of those processes.”

“The first thing I am doing is understanding, cleaning, and linking that data…that’s a surprisingly large part of the process. Once we clean it, we analyze the data to understand how an organization attracts, selects, develops and retains its workforce, with a particular focus on underrepresented groups. What we’re looking for is how important people-related outcomes in an organization – things like who is hired, who is promoted, and who stays or leaves – might vary depending on your identity. At the same time, our consultants are doing the qualitative research, and then we synthesize those findings internally.”

“(Then) our learnings from (the data) shape the strategic recommendations that we offer to clients to, again, improve how they attract, select, develop and retain all employees, and particularly those from underrepresented groups. Clients often come to us with a sense that they have room to improve with respect to diversity and inclusion, but don’t know where to focus their attention or what to do once they determine that focus. Our analyses provide insight not only into where our clients should concentrate their efforts, but also into what solutions to implement. For example, a client might believe they have a problem attracting a diverse set of applicants, but we find that their applicant pool is relatively diverse and underrepresented people are simply falling out of the funnel early in the hiring process. We might have that client concentrate less on active sourcing, then, and instead focus on ensuring their early stage selection processes are fair.”

TI: How do you wrangle all the diverse, siloed data and organize it for your internal analysis purposes?

“Storage is not usually a difficult issue in terms of size, but there are obviously security concerns that we take incredibly seriously, as we deal with sensitive data.”

“R is our main wrangling tool and we use some of the really great open source packages developed by the R community that allow us to create interactive dashboards and crisp visualizations as a means of communicating back insights, both internally to other members of our team and to our clients as well.”

TI: How does machine learning impact your current work? How are you using it?

“We use some ML techniques to impute missing demographic data. It’s often in the applicant data from recruiting software systems where we have the most missing data in terms of demographics. Once we impute the missing data, we are able to ask, for example, ‘Here are people from different demographic groups, how are they entering the application pipeline? Are they coming in through referrals? Through the company’s job website? Or through third party boards or events?’  A lot of the companies we work with actually track this information at a granular level, making it easy to gain insight about who is entering the funnel through what sources, and how successful those sources are.”
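A deliberately simplified sketch of this kind of imputation, using made-up applicant records and a plain mode-within-group rule rather than the actual ML techniques Paradigm uses (the field names and categories below are hypothetical):

```python
from collections import Counter, defaultdict

# Hypothetical applicant records; None marks missing demographic data.
applicants = [
    {"source": "referral", "gender": "F"},
    {"source": "referral", "gender": "F"},
    {"source": "job_board", "gender": "M"},
    {"source": "job_board", "gender": None},
    {"source": "referral", "gender": None},
]

def impute_mode(records, group_key, target_key):
    """Fill missing target values with the most common value
    observed among records sharing the same group_key."""
    modes = defaultdict(Counter)
    for r in records:
        if r[target_key] is not None:
            modes[r[group_key]][r[target_key]] += 1
    for r in records:
        if r[target_key] is None:
            counts = modes.get(r[group_key])
            if counts:
                r[target_key] = counts.most_common(1)[0][0]
    return records

imputed = impute_mode(applicants, "source", "gender")
```

Real demographic imputation uses far richer models and features; the point here is only the shape of the step, filling gaps by inference so that downstream funnel analysis is possible.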

TI: How do you see machine learning impacting your work five years from now?

“(Currently, Paradigm) is using existing tools primarily for imputation, and we aren’t pushing the envelope too far. At this stage, I think this is wise. Our work with clients has real outcomes on people, and you need to really know and take seriously the implications of what you are doing when you are using these new and exciting tools.”

“I think we are going to continue to see a lot of AI and machine learning move into the HR space. There are examples of companies who are already using this well, like Textio, but I think it’s important to be both optimistic and suspicious about new technologies in this space. Do we want machine learning to make hiring decisions? One might argue that is going to remove bias from the process because it removes the need for human judgement, but at the same time you have to wonder: what is the data underlying these models? It is very difficult to find data that links characteristics of people to their employment outcomes and is free from human bias, so any machine learning model built on that data is likely to replicate those issues.”

“But there are reasons to be optimistic about the future of machine learning. For example, I am seeing a lot of work to actually diversify the machine learning industry. The advocacy there is really important because people are starting to understand that who makes these tools matters a lot.”

TI: What recommendations do you have for organizations who want to use data to understand and improve their current HR practices?

“A lot of (companies) already have recruiting software – applicant tracking systems – because even at small companies recruiting and hiring is such a heavy lift that most people find themselves looking into systems that will help them make that easier.”

“I think where companies can improve is really honoring the recording of data that isn’t auto-populated through every ATS (applicant tracking system). For example, in applicant data, taking the time to record every applicant that comes through a referral and who that referral is. This is really important to understand the success of hiring sources, especially for some of the larger companies we’ve worked with. Even if less than 3% of the applicant pool are referrals, we may end up finding that referrals comprise over 30% of hires. When companies really make sure that everyone getting referred is documented, we can feel confident and clear in our insights. Most companies are collecting this data in some way, but there’s a lot of variance in the quality of data that individuals need to record themselves.”
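The arithmetic behind that referral observation can be made concrete with hypothetical funnel numbers (the counts below are invented for illustration, not from any client):

```python
# Hypothetical per-source funnel counts: referrals can be a tiny share
# of the applicant pool yet a large share of hires.
funnel = {
    "referral":  {"applicants": 120, "hires": 18},
    "job_board": {"applicants": 3600, "hires": 30},
    "website":   {"applicants": 1280, "hires": 12},
}

total_apps = sum(s["applicants"] for s in funnel.values())
total_hires = sum(s["hires"] for s in funnel.values())

for source, s in funnel.items():
    share_of_pool = s["applicants"] / total_apps
    share_of_hires = s["hires"] / total_hires
    print(f"{source}: {share_of_pool:.1%} of pool, {share_of_hires:.1%} of hires")
```

With these numbers, referrals are under 3% of the pool but 30% of hires, which is exactly why unrecorded referrals can silently distort any analysis of hiring sources.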

“I’m really looking forward to the development of more HRIS/ATS systems in this space that will streamline data collection and link various systems (performance, recruiting, internal surveys, etc). Until then, I think the (best) thing to do is to really honor the data collection process with an eye towards making it legible for other people internally or externally to use in the future. This will happen naturally as people analyst positions become more of a norm, but until then, I think people think of the data collection process as just a burden with no end goal. I get that, but if done well, it really gives us (and our clients) the opportunity to gain meaningful and accurate insights.”

Social Media Principles, Evolution and Future

Marketing is a process as old as business itself. However, the invention and proliferation of digital advertising, often through social media channels, has transformed the ways in which brands and their marketing agencies interact with current and potential customers.

Rich Taylor, a Chief Marketing Officer (CMO) and executive-level consultant with three decades of industry experience, sat down with the True Interaction team to share his expertise on how companies can leverage the power of social media to better market their products. Below are his insights into the changing digital advertising space.

 

TI: What do you see as the key principles of effective social media use by brands and agencies?

It’s not advertising. “There has been a push from all the social channels to monetize social (media), and in that regard they get advertisers to pay for eyeballs. But all the research shows that when you run advertising on social channels it does not work well. People are not tuning in to social media to watch ads. What they want is content marketing…don’t create ads; engage your audience.”

Focus on the service component. “Social media is a chance to interact with your customers, to thank them for their purchases and positive comments, address a concern or issue they have promptly. That’s what people want. People want to know that you are paying attention, listening and that you will respond.”

Personify your use of social media. “Social (media) is about interacting with friends and family or colleagues and other professionals. There is an inherently personal element to it. Having a voice or personality matters. You may be a great big successful brand or a small startup brand, but if you haven’t established a tone, a personality, a voice, a manner of interacting, you are not going to get the type of engagement you are really looking for.”

 

TI: True Interaction helps companies across a number of industries in automating their data ingestion, management and visualization processes. How do you see companies successfully using automation in the social media space?

“The trend is towards convenience. ‘How do I interact and make it very easy for my customer?’ (For example) Domino’s uses chatbots and artificial intelligence to address frequently asked questions and also to streamline orders…they created an automation mechanism within social media so that when you sent a pizza emoji or image it would trigger an order based on your profile. They are using automation and tools to say, ‘I have you engaged somewhere, Instagram, Twitter, Facebook, Google. Now, how do I take that interaction quickly and easily from our marketing group to our sales and customer service areas?’.”

“Also, digital personalization. The Amazon Reviews Effect of ‘other customers like you bought this’…There is a lot of technology and automation now being done to personalize the content you see based on your browsing history, your shopping history, what they know about you. And they gather that information from your email, from cookies, from a lot of different things. I think applying and using personal data is the challenge now for a lot of marketers.”

 

TI: Solutions like chatbots, digital personalization and machine learning may, at this time, be most attainable for more established, well-resourced brands. What social media solutions would you recommend to smaller businesses with greater resource scarcity?

“The interesting thing that I am seeing is that big companies like Walmart…who can afford to ingest very expensive and sophisticated shopper data and analytics programs…put a lot of pressure on mom and pops and regional players. The great news is that the historic barriers to success, the costs of IT infrastructure and systems, are coming down…and what you are finding is that merchant vendors themselves are helping (small businesses)…American Express is an example. (American Express) recognizes that (small businesses) want to have access to what their customers are using their cards for. If AmEx has this capability, (a small business) does not have to be a Walmart that is going to build (their own system) or bring someone in to build it for them. They can just talk to American Express or another company in the digital space that now has the capability to do this work, and accessing it does not require a lot of resources. You just need a combination of access to the tool and some talent or some consultants who can help you mine that data.”

 

TI: How has social media evolved during your time as a marketer?

Social Media 1.0: “Blasting out one-size-fits all ads to everybody.”

Social Media 2.0: “Traditional advertising methods…versioning content based on a segment of customers.”

Social Media 3.0: “Big data. I am going to be using big data, whether it’s shopping data or a combination of shopping data and search data to more precisely personalize the content to an individual and make it relevant to them. For example, retargeting ads. If you are on Zappos.com and you look for a pair of shoes, next time you log into Facebook, all of a sudden you see the shoes you were looking for being retargeted to you in the ad.”

 

TI: So what is the future of social media? In other words, what is Social Media 4.0?

“That’s a good question. I think we are starting to see a little fatigue with social media. You are not seeing new platforms pop up and succeed as fast as they did. I think at some point Facebook and Twitter are not the channels you tune into anymore….I am going to set up Rich Taylor’s network, and it is going to be my personalized list…I am going to build my own media and entertainment channel, and it is going to be based on my habits and preferences for information.”

“People are spending a lot of time on Facebook right now, but that’s because Facebook is holding them hostage and they can’t get their content anywhere else…I think you are going to see social media get more focused on integrating with broadcast. How does broadcasting merge ultimately with digital? Broadcasting is the powerhouse in advertising and the easiest form of entertainment, and I think you are seeing a push towards more and more video content that is easy to digest.”

“At some point, I will be watching my TV or a program on my tablet, and there might be a scroll or a ticker, like financial news, scrolling underneath with my social elements in it and I might see a quick blip or a picture or an image that might say, ‘you gotta see this!’ and then I might click on it, it will open up, and I will watch it.”

 

EDITOR’S NOTE: This article is about how to approach and think about social media and digital advertising. True Interaction built SYNAPTIK, our Data Management, Analytics, and Data Science Simulation Platform, specifically to make it easy to collect and manage core and alternative data for more meaningful data discovery. For more information or a demo, please visit us at https://synaptik.co/ or email us at hello@www.true.design.

Cybersecurity: Is Machine Learning the Answer?

EDITOR’S NOTE: This article is about how to approach and think about cybersecurity. True Interaction built SYNAPTIK, our Data Management, Analytics, and Data Science Simulation Platform, specifically to make it easy to collect and manage core and alternative data for more meaningful data discovery. For more information or a demo, please visit us at https://synaptik.co/ or email us at hello@www.true.design.

The U.S. Department of Homeland Security has designated October as National Cybersecurity Awareness Month. Cybersecurity is currently at the forefront of many Americans’ minds following the sweeping data breach earlier this year at Equifax, a consumer credit reporting agency. The current total of Americans affected by the breach stands at 145.5 million; moreover, the breach is arguably the greatest of all time for its depth in addition to its scale. Compromised data includes names, birthdays, addresses and Social Security numbers. Even more alarming, consumers’ security questions and answers may have also been breached, giving hackers and their clients the ability to lock victims out of their private accounts by altering passwords and other account settings.

But with advances in machine learning (ML) and artificial intelligence (AI) technologies to snuff out potential malware threats, shouldn’t business users be able to construct more robust fortresses for their data?

The answer is yes, but this optimism should be tempered as the potential is more limited than many might think.

Some brief definitions require clarification before exploring key limitations of applying machine learning and AI algorithms for cybersecurity measures.

Machine Learning is the ability for computer programs to analyze big data, extract information automatically and learn from it.1

Artificial Intelligence is the ability of a digital computer or computer-controlled robot to perform tasks commonly associated with intelligent beings such as the ability to reason, discover meaning, generalize or learn from past experience.2

Cybersecurity is the practice of protecting systems, networks and programs from digital attacks. These attacks are usually aimed at accessing, changing, or destroying sensitive information, extorting money from users or interrupting normal business processes.3

So why should we temper our expectations when deploying machine learning algorithms for cybersecurity needs?

Machine Learning is not AI. Machine learning has amazing capabilities including the ability to analyze and learn from big data sets. However, the ability to learn should not be confused with the ability to reason or self-reflect, two characteristics of AI and humans. Therefore:

Machine Learning Will Identify Data Anomalies Better than Malware. Machine learning algorithms can become very adept at identifying anomalies in large data sets, especially if they have a generous amount of training data. However, spotting threats becomes infinitely more complex if the algorithm must distinguish benign anomalies from malicious ones, as well as handle unforeseen randomness. As a result:

Machine Learning Will Likely Produce Excessive False Positives. False positives could represent over 99% of the anomalies a malware-detection model flags. Regardless, every surfaced anomaly may require human follow-up, quickly zapping the limited cybersecurity resources available to many organizations.4
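As a toy illustration of why false positives pile up, here is a minimal statistical anomaly flagger run on synthetic traffic data (everything below is simulated, not a real detector): it catches the one planted attack, but also flags a pile of ordinary readings, each of which would demand human follow-up.

```python
import random
import statistics

random.seed(7)

# Simulated daily outbound-traffic volumes (MB): 500 benign readings
# plus one genuinely malicious exfiltration spike at the end.
benign = [random.gauss(100, 10) for _ in range(500)]
traffic = benign + [400.0]

# "Train" on the historical benign data, as a detector would.
mean = statistics.fmean(benign)
stdev = statistics.pstdev(benign)

# Flag anything more than two standard deviations from the mean.
flags = [i for i, x in enumerate(traffic) if abs(x - mean) > 2 * stdev]

false_positives = [i for i in flags if i != 500]
print(f"flagged: {len(flags)}, false positives: {len(false_positives)}")
```

A two-sigma rule flags roughly 5% of perfectly normal readings by construction, which is the resource-drain dynamic the paragraph above describes; real detectors are more sophisticated, but the trade-off between sensitivity and analyst workload remains.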

The question remains: how should cybersecurity leaders in organizations leverage machine learning to sniff out malevolent attacks? The answer combines the power of people with the power of machines. Heather Adkins, director of information security and privacy and a founding member of Google’s security team, recommends that companies “pay some junior engineers and have them do nothing but patch”.5 However, as machine learning cybersecurity algorithms are fed more data and supplemented with isolation capabilities that confine breaches for human study, a symbiosis between people and machines can prove more effective than siloed efforts to combat malicious threats online.

What might more concrete solutions look like? Attend our upcoming panel discussion to find out more.

Aligned with the goals of Cybersecurity Awareness Month, True Interaction is co-hosting a panel discussion this Thursday evening alongside the law firm PBWT (Patterson Belknap Webb & Tyler LLP), the digital agency Tixzy, and the strategic IT services provider Optimum Partners. Topics will include a high-level discussion of the current cybersecurity regulatory landscape, third-party risk and insider threats.

 


 

1. https://www.forbes.com/sites/forbestechcouncil/2017/08/07/what-is-machine-learning/#2de9f74679a7

2. https://www.britannica.com/technology/artificial-intelligence

3. https://www.cisco.com/c/en/us/products/security/what-is-cybersecurity.html

4. https://www.forbes.com/sites/forbestechcouncil/2017/08/21/separating-fact-from-fiction-the-role-of-artificial-intelligence-in-cybersecurity/#13ad1fe81883

5. https://www.cnbc.com/2017/09/18/google-security-veteran-says-internet-not-secure-ai-wont-help.html

SEO = No Longer Enough

EDITOR’S NOTE: This article is about how to approach and think about Search Engine Optimization (SEO). True Interaction built SYNAPTIK, our Data Management, Analytics, and Data Science Simulation Platform, specifically to make it easy to collect and manage core and alternative data for more meaningful data discovery. For more information or a demo, please visit us at https://synaptik.co/ or email us at hello@www.true.design.

So you have built an amazing product and an equally amazing website to match. What can you do to stand out and get found online amongst fierce competition? You need to leverage every tool within your reach – in particular ones that are of low or minimal cost. Search Engine Optimization (SEO) is one of the best tools to make sure your potential clients find your site, and new advances in Machine Learning technology hold the power to take SEO strategy to the next level.

A clear understanding of Machine Learning, Search Engine Optimization (SEO) and Content Marketing is necessary before discussing how the former can positively impact the latter. For the purpose of this post:

Machine Learning is an application of artificial intelligence (AI) that provides systems the ability to automatically learn and improve from experience without being explicitly programmed. Machine learning focuses on the development of computer programs that can access data and use it to learn for themselves.1

Search Engine Optimization (SEO) is a methodology of strategies, techniques and tactics used to increase the number of visitors to a website by obtaining a high-ranking placement on the search engine results page (SERP) of search engines such as Google and Bing. Using specific keywords on landing pages is one of the best-known tactics for increasing the number of site visitors.2

Content Marketing: a type of marketing that involves the creation and sharing of online material (such as videos, blogs, and social media posts) that does not explicitly promote a brand but is intended to stimulate interest in its products or services.3

So how can Machine Learning drive SEO for your website? One primary use case is through listening for, identifying and even predicting the optimal keywords to drive users to your site. This process may look similar to the one below:

Listen for new SEO Keywords. Many Natural Language Processing tools, and more specifically Sentiment Analysis tools, that harness Machine Learning algorithms enable business users to track what potential customers are saying and how they feel about social media content over time. Analyses run on these findings can allow users to identify common noun-based or emotion-based keywords to display in prominent locations on their landing and internal pages to increase SEO scores.
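A minimal sketch of this listening step, using a plain keyword counter on invented posts in place of a full NLP or sentiment pipeline (the posts and stopword list are made up for illustration):

```python
import re
from collections import Counter

# Hypothetical social posts mentioning a product.
posts = [
    "Love the new dashboard, the reporting feature is fast",
    "The reporting export is broken again, so frustrating",
    "Fast setup, great dashboard, happy with support",
]

STOPWORDS = {"the", "is", "so", "with", "a", "new", "again"}

def keyword_counts(texts):
    """Tokenize posts and count candidate keywords, skipping stopwords."""
    counts = Counter()
    for text in texts:
        for token in re.findall(r"[a-z]+", text.lower()):
            if token not in STOPWORDS:
                counts[token] += 1
    return counts

top = keyword_counts(posts).most_common(3)
```

Even this crude count surfaces recurring terms ("dashboard", "reporting", "fast") that a marketer could test as landing-page keywords; real sentiment tools add polarity and entity recognition on top of the same basic idea.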

Seek out connections between SEO Keywords. Cluster analyses automated through Machine Learning can allow business users to find hidden connections between SEO keywords. After using social listening tools, business users can visualize groupings of emergent noun-based or emotion-based keywords to determine connections between them. Doing so can empower business users to organize their landing and other pages around popular keyword clusters to increase their site’s SEO score.
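One lightweight way to sketch this clustering step is a greedy co-occurrence grouping on hypothetical listening output; a real cluster analysis (e.g. k-means on keyword embeddings) would replace the union-find grouping below, but the input/output shape is the same.

```python
from collections import defaultdict
from itertools import combinations

# Hypothetical keyword sets extracted from individual social posts.
post_keywords = [
    {"dashboard", "reporting", "export"},
    {"dashboard", "reporting"},
    {"pricing", "discount"},
    {"pricing", "discount", "renewal"},
]

# Count how often each keyword pair appears in the same post.
cooccur = defaultdict(int)
for kws in post_keywords:
    for a, b in combinations(sorted(kws), 2):
        cooccur[(a, b)] += 1

# Greedy single-link grouping: keywords that ever co-occur share a cluster.
parent = {}

def find(x):
    parent.setdefault(x, x)
    while parent[x] != x:
        parent[x] = parent[parent[x]]  # path compression
        x = parent[x]
    return x

for (a, b), _count in cooccur.items():
    parent[find(a)] = find(b)

clusters = defaultdict(set)
for kw in {k for kws in post_keywords for k in kws}:
    clusters[find(kw)].add(kw)
```

On this toy data the keywords separate cleanly into a product-feature cluster and a pricing cluster, suggesting two distinct page themes to build out.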

Predict the future of SEO Keywords. Time Series analysis and forecasting allows business users to utilize machine learning to take a look into the future of SEO for their business. By harnessing Machine Learning libraries like TensorFlow, business users can leverage existing natural language and sentiment data gleaned from social and other digital sources to look for cyclical or other patterns in common noun-based or emotion-based keywords. These predictive capabilities allow business users to adapt constantly and stay ahead of the competition through SEO leadership.
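A minimal illustration of the forecasting idea, fitting a least-squares trend line to invented weekly mention counts rather than training a full TensorFlow model (the series below is made up; a real forecast would also model seasonality and uncertainty):

```python
# Hypothetical weekly mention counts for one keyword.
mentions = [120, 135, 128, 150, 162, 158, 171, 185]

def linear_forecast(series, steps):
    """Fit a least-squares trend line and extrapolate `steps` ahead,
    a minimal stand-in for a real time-series model."""
    n = len(series)
    xs = range(n)
    x_mean = sum(xs) / n
    y_mean = sum(series) / n
    num = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, series))
    den = sum((x - x_mean) ** 2 for x in xs)
    slope = num / den
    intercept = y_mean - slope * x_mean
    return [intercept + slope * (n + i) for i in range(steps)]

next_weeks = linear_forecast(mentions, 2)
```

A rising forecast like this one would argue for investing in content around that keyword before competitors catch on; a flat or declining trend would argue for rotating it out.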

So how can business users tap machine learning capabilities including natural language processing / sentiment analysis, cluster analysis and time series analysis on their own data with one easy-to-use platform? Synaptik. The Synaptik platform also provides data management and visualization, as well as plugins customized to the needs of business users. Sign up for a 30-minute consultation and we can show you what customers are saying about your products and services across multiple social media channels online (Facebook, Twitter, LinkedIn, etc.).

1 expertsystem.com

2 webopedia.com

3 wikipedia.org

By Joe Sticca

What you need to know: The Future of FinTech, RegTech and Wealth Management in the Digital Space

The tipping point is here. High-tech business intelligence tools with built-in machine learning algorithms and big data inputs were once reserved for the Fortune 500. Now, FinTech has shifted from early-stage adopters to mainstream money managers, and former technophobes are starting to digitize their businesses from end to end. New low-cost, user-friendly self-service tools that can produce rapid-fire insights and on-demand customer service are finally within reach and can give family wealth managers the brain power they need without the additional headache. Synaptik, True Interaction’s “Plug, Play, Predict” machine learning platform, is already serving companies in the space, providing value more quickly than industry norms.

A Reuters white paper on the digitization of wealth management identified three drivers behind the mainstream movement towards FinTech:

– New tools for investment research, risk management, trade processing, compliance, and reporting
– New business models offering better, faster, cheaper variants of existing services in investment management and brokerage
– New marketplaces, new managers, and new financial products that are changing the way capital and risk are allocated

In this blog post, we’ll explore disruptive technologies that traditional firms with limited IT expertise can use to beat the market, improve existing services and stay on top of an increasingly complex regulatory environment. By leveraging the cloud, open source, big data, artificial intelligence, APIs and chatbots, companies can create robust digital ecosystems that will win younger clients and increase profits across the board. Companies that continue to resist digital transformation run the risk of becoming less competitive, while those that embrace the opportunity will benefit from supplementing talented human capital with technological know-how.

Courtesy of PWC

Beat the Market

Big name hedge funds and investment firms deploy AI to comb through the internet for new investment opportunities. The elusive “super-algo” can swallow huge amounts of information from news reports, databanks and social media platforms and quickly optimize portfolios to profit from microscopic ripples and seismic shifts in the market. While private family wealth managers have relied on traditional methods and experience to pinpoint good investment opportunities, machine learning can provide the edge they need to compete in a volatile world. Now, building data ecosystems that provide real-time information and time series data on company performance and consumer trends no longer requires a Ph.D. in data analytics or computer science.

When considering investment management software, companies should look for key features including scenario simulation, modeling, portfolio rebalancing, performance metrics, yield curve analysis and risk analytics. The software should also be flexible, adaptable and able to ingest structured and unstructured data. The costs of professional investment programs range from $1,300 to $8,000, but as the market matures costs are likely to come down.

Money Management on Demand

Wealth management firms have relied on traditional relationship-driven business models for decades. But the personal touch that keeps more senior clients happy may repel the next generation. To attract younger clientele, companies need to invest in on-demand, low-touch digital customer service models that provide better transparency and more autonomy to their clients. Creating a flexible digital strategy that allows different client segments to engage with their portfolio independently, and with their advisor as little or as often as they want, is key to success. EY’s report “Advice goes virtual” looks at the range of innovative wealth management models that are now available and highlights firms that have struck the perfect balance between automation and human capital. Companies like Personal Capital, Future Advisor and LearnVest provide digital platforms with phone-based financial advisor services to meet the needs of busy millennials and to satisfy clients who prefer a dedicated human who knows the future they want to make for themselves. EY’s chart on innovations in wealth management sums up the range of digital opportunities that clients are gravitating towards.


Courtesy of EY

Automated Compliance

Since the financial crisis, the cost of compliance has risen steeply. Tech Crunch reports that “the global cost of compliance is an estimated $100 billion per year. For many financial firms, compliance is 20% of their operational budget.” Innovations in RegTech, an offspring of FinTech, can automate certain components of the compliance process and have the potential to dramatically reduce the cost of doing business. The Institute for International Finance (IIF) defines “RegTech” as “the use of new technologies to solve regulatory and compliance requirements more effectively and efficiently.”

Since 2008, the increasing speed of regulatory change has kept wealth management firms in a state of paralysis. Companies are constantly playing catch up and readjusting procedures to meet new requirements. In the not so distant future, integrated RegTech solutions will connect directly with regulatory systems and automatically update formulae, allowing wealth management firms to refocus their resources on revenue generating activities.

Instead of producing lengthy paper reports for regulators, new RegTech solutions can generate and communicate required reports automatically. Instead of scouring hundreds of documents and spreadsheets on a quarterly basis, RegTech solutions will alert compliance managers to risks in real-time so they can be eliminated immediately. The possibilities are endless and the cumbersome and costly task of navigating the increasingly complex regulatory environment will continue to generate more innovations in this field. While RegTech is still in its infancy, small family wealth management firms should start investigating this growing subsector and use this disruptive technology to their advantage.

Traditional wealth management firms that continue to resist the digital revolution will begin to look antiquated, even to their most senior clientele. True Interaction specializes in building and executing digital transformation strategies for companies that don’t have IT expertise. Synaptik, True Interaction’s CMS for data, is already providing firms in the FinTech, RegTech and AdTech spaces with easy-to-use data management, visualization, and deep learning insights. Our experts offer free consultations to help firms assess their needs and start planning their digital future. Schedule your custom consultation here.

By Nina Robbins

Categories
Insights

Intelligent influence: How to use Google APIs to bolster your social media content

As the old adage goes, a picture is worth a thousand words. For social media influencers, however, a picture or video can be worth thousands of dollars.

Social media influencing, driven primarily by sharing photo and video content across popular social media channels, represents a $1B USD industry with expectations of doubling in the next two years. Influencers need to seek out every advantage in order to survive in this growing yet infamously competitive marketplace. So how can social media influencers leverage technology in new ways to maximize the value of sharing media with their followers?

Below is a sample use case showing how a social media influencer can leverage the power of Google Cloud APIs to get more out of an existing media library. When executing this use case, it is important to consolidate strategies into corresponding tool sets and keep them under one application, so as not to create disparate processes and data sets. One leading solution for this is the Synaptik framework, discussed later in this post.

Tools: Google Cloud Video Intelligence API + Natural Language API

The Google Cloud Video Intelligence API lets users, for a nominal fee, search every moment of every video in their catalog and find each occurrence of an entity along with its significance. It quickly annotates videos stored in Google Cloud Storage, identifying the key noun entities in each video and when they occur.
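To make this concrete, here is a minimal sketch of how detected labels might be turned into a searchable index. The file names and label lists below are hypothetical stand-ins for what a real `annotate_video` call (via the `google.cloud.videointelligence` client, requesting `LABEL_DETECTION` on a `gs://` URI) would return; only the indexing logic is shown end to end.

```python
# Sketch: index videos by the noun labels the Video Intelligence API returns.
# In production, the labels would come from
#   videointelligence.VideoIntelligenceServiceClient().annotate_video(
#       request={"features": [videointelligence.Feature.LABEL_DETECTION],
#                "input_uri": "gs://..."})
# and read out of result.annotation_results[0].segment_label_annotations.
from collections import defaultdict

def build_label_index(video_labels: dict[str, list[str]]) -> dict[str, list[str]]:
    """Map each detected label (noun) to the list of videos it appears in."""
    index = defaultdict(list)
    for video, labels in video_labels.items():
        for label in labels:
            index[label].append(video)
    return dict(index)

# Hypothetical label output for three videos in the catalog.
catalog = {
    "monday_full_body.mp4": ["kettlebell", "treadmill", "gym"],
    "tuesday_cardio.mp4": ["treadmill", "bicycle"],
    "friday_strength.mp4": ["kettlebell", "barbell"],
}

index = build_label_index(catalog)
print(index["kettlebell"])  # ['monday_full_body.mp4', 'friday_strength.mp4']
```

Once the index exists, answering “which videos show a treadmill?” is a dictionary lookup rather than a manual scrub through hours of footage.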

The Google Cloud Natural Language API provides natural language understanding technologies to developers. Examples include sentiment analysis, entity recognition, entity sentiment analysis and text annotations.

Use Case: A fitness social media influencer wants to drive more insights from his collection of workout videos while better meeting the needs of his current and potential followers.

Action Steps:

1.) Identification of nouns within video content. The influencer wants to make sure that videos are tracked by workout type. He uses the Google Cloud Video Intelligence API to identify noun keywords within each video including treadmill, kettlebell, barbell and bicycle.

2.) Organization of video content. With nouns identified, the influencer can take several actions. First, he crops clips of certain activities to share daily on social media: one day highlighting kettlebells, another a treadmill routine. Second, he makes his YouTube videos searchable by activity so his followers and visitors can easily find specific routines within larger workouts.

3.) Tracking of follower response. Using the Google Cloud Natural Language API, the influencer can also track the positive and negative sentiment of followers towards each social media posting. This allows him to drive his digital video content strategy to better serve his customers and ultimately continue to grow his social influence!
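As a rough illustration of step 3, the sketch below averages per-comment sentiment scores for each post. The scores and post names are hypothetical stand-ins for the document sentiment scores (values between -1.0 and 1.0) that the Natural Language API’s `analyze_sentiment` method returns.

```python
# Sketch: aggregate follower sentiment per post. In production, each score
# would come from language_v1.LanguageServiceClient().analyze_sentiment(
#     request={"document": {"content": comment_text, "type_": "PLAIN_TEXT"}})
# which returns a document sentiment score in [-1.0, 1.0].
from statistics import mean

def average_sentiment(comment_scores: dict[str, list[float]]) -> dict[str, float]:
    """Average the per-comment sentiment scores for each post."""
    return {post: round(mean(scores), 2)
            for post, scores in comment_scores.items()}

# Hypothetical scores for comments on two posts.
scores = {
    "kettlebell_day": [0.8, 0.6, 0.9],       # mostly positive comments
    "treadmill_routine": [-0.2, 0.1, -0.4],  # mixed-to-negative comments
}
print(average_sentiment(scores))
# {'kettlebell_day': 0.77, 'treadmill_routine': -0.17}
```

A simple per-post average like this is enough to spot which content types resonate; a fuller pipeline could also weight by follower reach or track the trend over time.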

This work can seem daunting for a social media influencer, or really any small- to medium-sized business owner. However, Synaptik has the power to integrate Google Cloud Video Intelligence, Natural Language and dozens of other Google Cloud-based APIs into one easy-to-use, customizable platform. Sign up for a 30-minute consultation and we can show you what customers are saying about your products and services across multiple social media channels online (Facebook, Twitter, LinkedIn, etc.).

By Joe Sticca

Categories
Insights

Big Data Definition, Process, Strategies and Resources

Are we at the Big Data tipping point?

The Big Data space is heating up, to the point that some experts now see it as the over-hyped successor to cloud computing. The publicity may be a bit much, but Big Data is already living up to its potential, transforming entire business lines such as marketing, pharmaceutical research, and cyber-security. As a business gains experience with specific kinds of data, certain issues tend to fade, but there will always be a new data source with the same unknowns waiting in the wings. The key to success is to start small: it’s a lower-risk way to see what Big Data can do for your firm and to test your business’s readiness to use it.

In nearly all corporations, Big Data programs get their start when an executive becomes convinced that the company is missing out on opportunities in data. Perhaps it’s the CMO looking to glean new insights into consumer behavior from web data, for example. That conviction leads to a comprehensive and laborious process in which the CMO’s team works with the CIO’s team to define the exact insights to be pursued and the analytics required to produce them.

Big Data: Find traffic bottlenecks?

The value of Big Data for network traffic and flow analysis lies in the ability to see across all networks, applications and users to understand how IT assets, and in particular network bandwidth, are being distributed and consumed. Several tools now let customers see precisely who is doing what on the network, down to the specific application or smartphone in use. With this real-time insight, combined with long-term usage history, clients can spot trends and outliers, identifying where performance problems are starting and why.

Big Data has swept into every industry and now plays an essential part in productivity growth and competition. Research indicates that the digital combination of data, processing power and connectivity is ripe to shake up many segments over the next 10 years.

Big Data: What type of work and qualifications?

Big Data’s artificial intelligence tools and methods can be applied in many areas. For example, Google’s search and advertising business and its new self-driving cars, which have navigated thousands of miles of California roads, both employ a suite of artificial intelligence techniques. Both are daunting Big Data challenges, parsing huge amounts of information and making decisions in real time.

A Big Data specialist should master the components of the Hadoop ecosystem, such as Hadoop 2.7, YARN, MapReduce, Pig, Hive, Impala, HBase, Sqoop, Flume, and Apache Spark. They should also get hands-on practice on CloudLabs by implementing real-life projects in areas such as banking, telecommunications, social media, insurance, and e-commerce.
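The MapReduce model at the heart of that ecosystem can be illustrated in a few lines of plain Python: map each record to key-value pairs, shuffle the pairs by key, then reduce each group. This is a toy sketch of the paradigm, not Hadoop itself; a real cluster distributes each phase across many machines.

```python
# Toy illustration of the MapReduce model underlying Hadoop: map each line
# to (word, 1) pairs, shuffle by key, then reduce by summing the counts.
from collections import defaultdict

def map_phase(lines):
    """Map: emit a (word, 1) pair for every word in every line."""
    for line in lines:
        for word in line.split():
            yield (word.lower(), 1)

def shuffle(pairs):
    """Shuffle: group all emitted values by their key."""
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    """Reduce: collapse each key's list of values into a single count."""
    return {word: sum(counts) for word, counts in grouped.items()}

lines = ["big data big insights", "big wins"]
counts = reduce_phase(shuffle(map_phase(lines)))
print(counts)  # {'big': 3, 'data': 1, 'insights': 1, 'wins': 1}
```

The appeal of the model is that the map and reduce functions contain no coordination logic at all; the framework handles distribution, which is what lets the same word-count shape scale from two lines to petabytes.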


Image: Erik Underwood/TechRepublic

How can the value of Big Data be defined?

The Big Data wave is all about detecting hidden value in data assets. It is typically pictured as a large organization bringing all of its different data sources, big and complex, together, then boiling that data down into still sizable but far more manageable data sets. This data can then be attacked with advanced analytics, machine learning, and all manner of mathematics, from which new and unforeseen insights can emerge.

Experts say that when Big Data programs fail, it’s frequently because businesses have not clearly defined their objectives, the analytics problem they want to answer, or the metrics they’ll use to measure success. An example of a program with a clearly defined and quantifiable objective is a retailer wanting to improve the accuracy of inventory in its stores: that reduces waste and improves profitability. Measuring before-and-after accuracy is easy; so is calculating ROI based on the resulting increase in profitability.

Big Data: Who should receive measurement reports?

The boom in the B2B Big Data market (from a sub-$100m business in 2009 to $130bn today) reflects an enterprise-led scramble to invest in data mining, reminiscent of the California gold rush and accompanied by a similar media buzz. Big Data is one of those terms that gets thrown around lots of businesses without much agreement on what it means. Technically, Big Data is any pool of data assembled from more than a single source. Not only does this trigger the interoperability problems that make data interchange so frustrating, it also makes it hard to know what data is available, what format it’s in, how to synthesize old and new data, and how to architect a practical way for end users to interact with Big Data tools.

In addition to the right tools and methods, providers should invest time and manpower in building the capabilities to make analytics work for them. This includes creating a dedicated team of specialists to supervise Big Data programs, implement and enhance software, and persuade users that the new strategies are worth their while. Given the extensive potential in the marketing industry, stakeholders need to create smart methods for managing the Big Data in their audience metrics. The creation of a unified public metric standard is a hard but essential objective, and stakeholders should strive to give users complete transparency regarding tracking data as well as opt-out systems.

Robust metadata and strong stewardship procedures also make it simpler for corporations to query their data and get the answers they are expecting. The ability to query data is foundational for reporting and analytics, but corporations must typically overcome a number of challenges before they can engage in meaningful examination of their Big Data assets. Businesses can do this by making sure there is active participation and backing from one or more business leaders when the original plan is being developed and when the first implementations take place. Also of vital importance is continuing collaboration between the business and IT divisions, which ensures that the business value of every Big Data analytics venture is properly understood.

A recent KPMG study showed that only 40% of senior managers have a high level of trust in the user insights from their analytics, and most indicated their C-suite did not fully support their current data analytics strategy. 58% of organizations report that the impact of Big Data analytics on earnings was 3% or less. The real bonanza appears limited to banking, supply chains, and technical performance optimization, so understandably some organizations feel left behind.

Big Data: How much value is created for each unit of data (whatever it is)?

The “big” part of Big Data refers to the volume of data available to analyze. In the supply chain realm, that could include data from point-of-sale systems, bar-code scanners, radio frequency identification readers, GPS devices on vehicles and in cell phones, and the software systems used to run transportation, warehousing, and other operations.

CIOs and other IT decision makers are used to having to do more with less. In the world of Big Data, they may be able to achieve cost savings and efficiency gains across IT operations and business intelligence (BI) strategies by exploiting advances in open-source software, distributed data processing, cloud economics and microservices development.

Consultants who work with businesses on analytics projects cite additional supply chain improvements that result from Big Data programs. For example, one online retailer uses sales data to forecast which color sweaters sell best at different times of the year. Based on that data, the company now has its suppliers produce undyed sweaters and dye them later, according to consumer demand determined in near-real time.
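The retailer example can be sketched in a few lines: count recent sales by color, then allocate the undyed inventory in proportion. The sales numbers and inventory size below are hypothetical; a real system would pull them from point-of-sale data.

```python
# Sketch of the undyed-sweater example: allocate blank inventory to colors
# in proportion to recent sales. All figures here are hypothetical.
from collections import Counter

def dye_plan(sales: list[str], blank_inventory: int) -> dict[str, int]:
    """Split the undyed sweaters across colors by their share of recent sales."""
    counts = Counter(sales)
    total = sum(counts.values())
    return {color: (n * blank_inventory) // total
            for color, n in counts.most_common()}

# 100 hypothetical recent sales: 50 red, 30 navy, 20 green.
recent_sales = ["red"] * 50 + ["navy"] * 30 + ["green"] * 20
print(dye_plan(recent_sales, blank_inventory=200))
# {'red': 100, 'navy': 60, 'green': 40}
```

The proportional split is the simplest possible demand signal; the value of the Big Data version is that the same calculation runs continuously on live sales data instead of once a season on a hunch.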

Data scientists and engineers, as well as architects and designers with the expertise to work with Big Data tools, are in demand and well-compensated. Want an extra edge when seeking your next assignment? Get Big Data certified.

Is senior management in your organization involved in Big Data-related projects?

As with any business initiative, a Big Data program involves an element of risk. Any program can fail for any number of reasons: poor management, under-budgeting, or a lack of relevant expertise. But Big Data projects carry their own specific risks.

An increasingly competitive landscape and the cyclical nature of business require timely access to accurate business data. The technical and organizational challenges associated with Big Data and advanced analytics make it hard to build in-house applications; the result is often ineffective solutions and paralyzed businesses.

Large-scale data gathering and analytics are quickly becoming a new frontier of competitive differentiation. Financial institutions want to use extensive data gathering and analytics to shape strategy. Data-related threats and opportunities can be subtle.

To support Big Data efforts there are two fundamental types of PMOs: one that acts in an advisory capacity, providing project managers in business units with training, direction and best practices; and a centralized variant, with project managers on staff who are lent out to business units to work on projects. How a PMO is organized and staffed depends on a myriad of organizational circumstances, including targeted objectives, existing strengths and cultural imperatives. When deployed in line with an organization’s culture, PMOs help CIOs deliver strategic IT projects that please both the CFO and internal clients. Over time (CIOs should allow three years to realize the benefits), PMOs can save organizations money by enabling stronger resource management, decreasing project failures and supporting the projects that offer the largest payback.

Next, get started with the Big Data Self-Assessment:

The Big Data Self-Assessment covers numerous criteria related to a successful Big Data project (a quick primer eBook is available to download via the link at the end of this article). In our Big Data Self-Assessments, we find that the questions addressed above are the most frequently raised criteria.

The Big Data Self-Assessment Excel Dashboard shows what needs to be covered to organize the business/project activities and processes so that Big Data outcomes are achieved.

The Self-Assessment provides its value by showing how to ensure that the outcome of any Big Data effort is maximized. It does this by ensuring that responsibilities for Big Data criteria are automatically prioritized and assigned, uncovering where progress can be made now.

To help professionals architect and implement the best Big Data practices for their organization, Gerard Blokdijk, head of and author at The Art of Service’s Self Assessments, provides a quick primer on the 49 Big Data criteria for any business, in any country, to implement within its own organization.

Take the abridged Big Data Survey Here:

Big Data Mini Assessment

Get the Big Data Quick Exploratory Self-Assessment eBook:

https://189d03-theartofservice-self-assessment-temporary-access-link.s3.amazonaws.com/Big_Data_Quick_Exploratory_Self-Assessment_Guide.pdf

by Gerard Blokdijk

Categories
Insights

Resiliency Tech: A Signal in the Storm

Redundancy is a four-letter word in most settings, but when it comes to emergency management and disaster relief, redundant systems reduce risk and save lives. Tropical Storm Harvey caused at least 148,000 outages for internet, TV and phone customers, making it impossible for many people to communicate over social media and text. In this blog post, we explore innovative ways smart cities can leverage big data and Internet of Things (IoT) technology to MacGyver effective solutions when go-to channels break down.

Flood Beacons

Designer Samuel Cox created the flood beacon to share fast, accurate flood-condition information. Most emergency management decisions are based on forecasts and person-to-person communication with first responders and people in danger. With the flood beacon, responders can see water levels, GPS coordinates and water movement in real time. The beacon is designed to have low power requirements and uses solar power to stay charged. Now it will be up to the IoT innovators of the world to turn the flood beacon into a complete solution that can broadcast emergency center locations and restore connectivity to impacted areas.

EMS Drones

The Health Integrated Rescue Operations (HiRO) Project has developed a first responder drone that can drop medical kits, emergency supplies and Google Glass for video conference communication. “EMS response drones can land in places that EMS ground vehicles either cannot get to or take too long to reach”, says Subbarao, a recognized expert in disaster and emergency medicine. “Immediate communications with the victims and reaching them rapidly with aid are both critical to improve outcomes.” – One of These Drones Could Save Your Life – Jan.12.2017 via NBC News.

Big Data Analytics and Business Intelligence

Emergency management agencies and disaster relief organizations have been using crowdsourcing and collaborative mapping tools to target impact areas, but poor data quality and a lack of cross-agency coordination continue to challenge the system. Business intelligence platforms that provide access to alternative data sets and machine learning models can help government agencies and disaster relief organizations corroborate and collaborate. By introducing sentiment analysis, keyword search features and geotags, organizations can quickly identify high-need areas. Furthermore, BI platforms with project management and inventory plug-ins can aggregate information and streamline deployment.

Smart emergency management systems must be flexible, redundant and evolve with our technology. At True Interaction, we believe that traditional private sector business intelligence tools and data science capabilities can help cross-agency collaboration, communication and coordination. Our core team of software developers is interested in teaming up with government agencies, disaster relief organizations and IoT developers to create better tools for disaster preparation and relief service delivery. Contact us here if you are interested in joining our resiliency tech partnership!

Categories
Insights

Blockchain 101 Self-Assessment

Blockchain is the new black. We’ve heard the term on conference calls, seen it on the covers of magazines and know it’s a hot topic on CNBC, but the barrage of information makes it difficult to distinguish hype from reality. It’s clear that Blockchain will revolutionize the world; understanding how is mission critical. In this blog post we’ll cover the Blockchain essentials and the most frequently asked questions we’ve come across.

At The Art of Service we’ve developed a Blockchain self-assessment tool that professionals use to test the depth of their knowledge on the Blockchain concept and its potential. The Blockchain self-assessment covers numerous criteria related to a successful project – a quick primer version is available for you to download at the end of the article.

BLOCKCHAIN FREQUENTLY ASKED QUESTIONS:

What Is the Blockchain?

The problem with nearly all Blockchain explanations is that they supply too much detail upfront and use jargon that leaves people more confused than when they started. We are in the nascent stages of this technological revolution, and it’s hard to predict how Blockchain will impact our institutions and our lives. New Blockchain-related technologies are being built every day, and the framework is still evolving.

Here are some key definitions and ideas to help you understand the fundamental pillars behind this insurgent technology:

1. Blockchain is a technology that essentially distributes an account ledger. Those of you in the financial management world know a ledger as the trusted source of transactions or facts. The same is true with Blockchain, but instead of living in a great leather-bound book or a financial management program, Blockchains are maintained by a distributed set of computing resources working together to keep that ledger up to date.
2. The Blockchain procedure of securely and permanently time-stamping and recording every transaction makes it very hard for a user to change the ledger once a block has been added.
3. Private Blockchains distribute identical copies of the ledger, but only to a restricted number of trusted participants. This model is better suited to applications requiring simplicity, speed, and greater clarity.
4. Users of Distributed Ledger Technology (DLT) benefit notably from its efficiencies, which create a more robust ecosystem for real-time, secure data sharing.
5. Blockchain is only one of various kinds of data structures that provide secure and verifiable distributed consensus. The Bitcoin Blockchain, which uses proof-of-work mining, is the most common approach in use today, but other forms of DLT consensus exist, such as Ethereum, Ripple, Hyperledger, MultiChain and Eris.
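A toy hash chain in Python shows why the time-stamped, chained recording described in point 2 is tamper-evident: each block stores the hash of its predecessor, so altering any recorded block invalidates everything after it. This is an illustration of the concept only, not a production implementation (a real chain would record current timestamps, real transactions, and a consensus mechanism).

```python
# Toy hash chain: each block embeds the previous block's hash, so any edit
# to recorded data breaks the chain and is immediately detectable.
import hashlib
import json

def block_hash(block: dict) -> str:
    """Hash a block's contents deterministically (excluding its own hash)."""
    payload = {k: block[k] for k in ("data", "prev_hash", "timestamp")}
    return hashlib.sha256(json.dumps(payload, sort_keys=True).encode()).hexdigest()

def make_block(data: str, prev_hash: str) -> dict:
    # Timestamp fixed at 0 for reproducibility; a real chain records the
    # current time when the block is appended.
    block = {"data": data, "prev_hash": prev_hash, "timestamp": 0}
    block["hash"] = block_hash(block)
    return block

def chain_is_valid(chain: list[dict]) -> bool:
    for i, block in enumerate(chain):
        if block["hash"] != block_hash(block):
            return False  # block contents were altered after recording
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False  # linkage to the previous block is broken
    return True

chain = [make_block("genesis", "0" * 64)]
chain.append(make_block("Alice pays Bob 5", chain[-1]["hash"]))
print(chain_is_valid(chain))   # True
chain[0]["data"] = "tampered"  # rewriting history...
print(chain_is_valid(chain))   # ...invalidates the chain: False
```

In a real network, many independent nodes each hold a copy of the chain and re-run exactly this kind of validation, which is why no single party can quietly rewrite the ledger.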

Blockchain: Who controls the risk?

Each party on a Blockchain has access to the entire database and its complete history. No single party controls the data or the information, and every party can verify the records of its transaction partners directly, without an intermediary.

For businesses, the conditions of Blockchain are very different: the identity of participants must be known, and permissioned Blockchains require no proof of work. Over the next few years, Blockchain growing pains will hit the industry and support systems will begin to take shape. Today, Blockchain lacks the supporting infrastructure available for cloud or traditional database setups; there are no systems management tools, reporting tools or legacy configuration integrations in place.

Could Blockchain be the structural change the market needs?

Blockchain’s foundational technology is the biggest innovation computer science has seen in a long time. The idea of a distributed database where trust is established through mass collaboration and clever code, rather than through a powerful institution, is game-changing. It will be up to the larger business community to determine whether Blockchain becomes the building block of the digitized economy or is disregarded and fades away. Building formidable, trustworthy Blockchain standards is the next step in turning this global opportunity into a reality.

Blockchain: What does the future hold?

There are many Blockchain and distributed ledger platforms emerging in the market, including BigchainDB, Billon, Chain, Corda, Credits, Elements, Monax, Fabric, Ethereum, HydraChain, Hyperledger, Multichain, Openchain, Quorum, Sawtooth and Stellar. Blockchain use cases span a number of industries, including insurance, healthcare and finance, but we are only scratching the surface of what’s possible.

Next, get started with the Blockchain Self-Assessment:

The Blockchain Self-Assessment Excel Dashboard provides a way to gauge performance against planned project activities and achieve optimal results. It does this by ensuring that Blockchain criteria are automatically prioritized and assigned; uncovering where progress can be made now; and what to plan for in the future.

To help professionals architect and implement best Blockchain practices for their organization, Gerard Blokdijk, author of The Art of Service’s Self Assessments, provides a quick primer on the 49 Blockchain criteria for any business in any country.

Get the Blockchain Quick Exploratory Self-Assessment eBook here:

https://189d03-theartofservice-self-assessment-temporary-access-link.s3.amazonaws.com/Blockchain_Quick_Exploratory_Self-Assessment_Guide.pdf

About the Author

Gerard Blokdijk is the CEO of The Art of Service. He has been providing information technology insights, talks, tools and products to organizations in a wide range of industries for over 25 years. Gerard is a widely recognized and respected information specialist. Gerard founded The Art of Service consulting business in 2000. Gerard has authored numerous published books to date.

By Gerard Blokdijk

Categories
Insights

Can Artificial Intelligence Catalyze Creativity?

In the 2017 “cerebral” Olympic games, artificial intelligence defeated the human brain in several key categories. Google’s AlphaGo beat the best player of Go, humankind’s most complicated strategy game; algorithms taught themselves how to predict heart attacks better than the AHA (American Heart Association); and Libratus, an AI built by Carnegie Mellon University, beat four top poker players at no-limit Texas Hold ‘Em. Many technologists agree that computers will eventually outperform humans on step-by-step tasks, but when it comes to creativity and innovation, humans will always be a part of the equation.

Inspiration, from the Latin inspiratus, literally means “breathed into.” It implies a divine gift: the aha moment, the lightning bolt, the secret sauce that can’t be replicated. Around the globe, large organizations are attempting to reshape their cultures to foster innovation and flexibility, two core competencies needed to survive the rapid-fire rate of change. Tom Agan’s HBR article “The Secret to Lean Innovation” identified learning as the key ingredient, while Lisa Levey believes that seeing failure as a part of success is key.

At the same time, although innovation is a human creation, machines do play a role in that process. Business leaders are using AI and advanced business intelligence tools to make operations more efficient and generate higher ROI, but are they designing their digital ecosystems to nurture a culture of innovation? If the medium is the message, then they should be.

“If you want to unlock opportunities before your competitors, challenging the status quo needs to be the norm, not the outlier. It will be a long time if ever before AI replaces human creativity, but business intelligence tools can support discovery, collaboration and execution of new ideas.” – Joe Sticca, COO at Synaptik

So, how can technology augment your innovation ecosystem?

Stop

New business intelligence tools can help you manage innovation, from sourcing ideas to generating momentum and tracking return on investment. For instance, to prevent corporate tunnel vision, you can embed online notifications that superimpose disruptive questions on a person’s screen. With this simple tool, managers can help employees step outside the daily grind to reflect on the larger questions and how they impact today’s deliverable.

Collaborate

The market is flooded with collaboration tools that encourage employees to leverage each other’s strengths to produce higher quality deliverables. The most successful collaboration tools are those that seamlessly fit into current workflows and prioritize interoperability. To maximize innovation capacity, companies can use collaboration platforms to bring more diversity to the table by inviting external voices including clients, academics and contractors into the process.

Listen

Social listening tools and sentiment analysis can provide deep insights into the target customer’s needs, desires and emotional states. When inspiration strikes, innovative companies are able to prototype ideas quickly and share those ideas with the digital universe to understand what sticks and what stinks. By streamlining A/B testing and failing fast and often, agile companies can reduce risk and regularly test their ideas in the marketplace.

While computers may never birth the aha moments that drive innovation, advanced business intelligence tools and AI applications can capture sparks of inspiration and lubricate the creative process. Forward-thinking executives are trying to understand how AI and advanced business intelligence tools can improve customer service, generate higher ROI, and lower production costs. Companies like Cogito are using AI to provide real-time behavioral guidance to help customer service professionals improve the quality of their interactions while Alexa is using NLP to snag the full-time executive assistant job in households all over the world.

Creativity is the final frontier for artificial intelligence. But rather than AI competing against our innovative powers, business intelligence tools like Synaptik can bolster innovation performance today. The Synaptik difference is an easy user interface that makes complex data management, analytics and machine learning capabilities accessible to traditional business users. We offer customized packages that are tailored to your needs and promise to spur new ideas and deep insights.

By Nina Robbins

Categories
Insights

Improving the Fan Experience through Big Data and Analytics

As consumer electronics companies produce bigger and better HD televisions, sports fans have enjoyed the ability to feel the excitement of the stadium from the comfort of their own homes. Broadcast companies like NBC, FOX, CBS and ESPN have further enhanced the viewing experience by engaging fans on social media platforms and producing bingeworthy content. The downside of high ratings is stagnating stadium attendance.

With the convenience of the at-home viewing experience, how can professional sport leagues bring fans back to the stadium? In a 1998 poll conducted by ESPN, 54% of fans revealed that they would rather be at a game than at home. However, when that poll was taken again in 2012, only 29% of fans wanted to be at the game.

Now, professional football teams are betting that big data can provide insights to help them get fans back in the seats. For instance, the New England Patriots have partnered with data science experts to better understand their fanbase. By investing in big data and high-powered analytics tools, the Patriots are uncovering new insights into consumer behavior, such as in-store purchases, ticket purchase information, and click rates, information that will help them optimize marketing and sales tactics.

While most Patriots games do sell out, there are instances where season ticket holders do not show up. With tools from Kraft Analytics Group (KAGR), the New England Patriots can access data from every seat in the stadium to see who will be attending and how many season ticket holders came to each game. By tracking all of this data, the Patriots are able to uncover insights into their fanbase that were previously unknown. Robert Kraft, owner of the New England Patriots, was asked about fan turnout and how valuable this information is for the team.

“If somebody misses a game, they get a communication from us and we start to aggregate the reasons why people miss one, two, or three games. At the end of the year, I can know everything that took place with our ticket-holders during that season. It’s incredibly valuable to adjust your strategy going forward depending on what your goals are.” – Robert Kraft, Owner of the New England Patriots
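The kind of season-long aggregation Kraft describes is, at its core, a simple tally over attendance records. A minimal sketch, assuming hypothetical ticket-holder data (the record fields and reason labels are invented for illustration):

```python
from collections import Counter

# Hypothetical records of season-ticket holders who missed a game,
# each tagged with a self-reported reason (illustrative data only).
missed_games = [
    {"ticket_holder": "A102", "game": 1, "reason": "weather"},
    {"ticket_holder": "B215", "game": 1, "reason": "travel"},
    {"ticket_holder": "A102", "game": 4, "reason": "weather"},
    {"ticket_holder": "C330", "game": 4, "reason": "schedule conflict"},
]

def aggregate_reasons(records):
    """Tally why ticket holders missed games across a season."""
    return Counter(r["reason"] for r in records)

def repeat_absentees(records, threshold=2):
    """Ticket holders who missed at least `threshold` games."""
    counts = Counter(r["ticket_holder"] for r in records)
    return {holder for holder, n in counts.items() if n >= threshold}

print(aggregate_reasons(missed_games))  # weather leads with 2 misses
print(repeat_absentees(missed_games))   # {'A102'}
```

At season's end, the reason tallies and the list of repeat absentees feed directly into the kind of strategy adjustments Kraft describes.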

Many teams are also turning to Internet of Things (IoT) solutions to enhance the fan experience. With IoT, devices throughout the stadium can be connected to the internet and to fans’ phones. Professional sports teams have taken advantage of technologies such as Apple’s iBeacon, which uses Bluetooth to connect with mobile devices and create a new kind of stadium experience. With beacons deployed at concession stands and around the ballpark, fans can find the closest pizza discount or the shortest bathroom line.

Beacon Stadium App (courtesy of Umbel)

IoT stadiums will eventually become the new norm, and the San Francisco Giants have become leaders in the revolution. Bill Schlough, CIO of the San Francisco Giants, commented on this trend:

“Mobile and digital experiences are paramount to our fan experience,” said Schlough, “and they have played a role in the fact that we’ve had 246 straight sellouts.”

Schlough and the Giants organization have taken an active role in offering fans a unique viewing experience. The team introduced cell phone coverage in the early 2000s and, in 2004, rolled out a plan to make AT&T Park a mobile hotspot. With Wi-Fi antennas across the stadium, fans can watch videos and use social media to interact with other fans in the stands.

As owners and cities continue to spend billions of dollars on new stadiums, meeting consumer demand will be crucial in a digital world. Teams like the New England Patriots and the San Francisco Giants have already started using tools like analytics and the Internet of Things to cater to the needs of their fans, and other teams will likely follow their path to provide a memorable game-day experience for their customers.

With Synaptik’s social listening tools and easy data management integration, companies can track conversations and data around specific topics and trends. Sign up for a 30-minute consultation.

Contributors:

Joe Sticca, Chief Operating Officer at True Interaction

Kiran Prakash, Content Marketing at True Interaction

Categories
Insights

Sparking Digital Communities: Broadcast Television’s Answer to Netflix

In the late 1990s and early 2000s, network television dominated household entertainment. In 1998, nearly 30% of the United States population tuned into the NBC series finale of “Seinfeld.” Six years later, NBC’s series finale of the popular sitcom “Friends” drew 65.9 million people to their television screens, making it the most-watched episode on US network TV in the early aughts. Today, nearly 40% of the viewers who tuned into the “Game of Thrones” premiere watched via same-day streaming services and DVR playback. The way people watch video content is changing rapidly, and established network television companies need to evolve to maintain their viewership.

While linear TV is still the dominant platform among non-millennials, streaming services are quickly catching up. As young industry players like Hulu, Netflix and YouTube transform from streaming services into content creators and more consumers cut ties with cable, established network broadcasters need to engage their loyal audiences in new ways. The challenge to stay relevant is further exacerbated by market fragmentation as consumer expectations for quality content with fewer ad breaks steadily rise.


Courtesy of Visual Capitalist

One advantage broadcast television still has over streaming services is the ability to tap into a network of viewers watching the same content at the same time. In 2016, over 24 million unique users sent more than 800 million TV related tweets. To stay relevant, network television companies are hoping to build on this activity by making the passive viewing experience an active one. We spoke with Michelle Imbrogno, Advertising Sales Director at This Old House about the best ways to engage the 21st century audience.

“Consumers now get their media wherever and whenever it’s convenient for them. At ‘This Old House,’ we are able to offer the opportunity to watch our Emmy Award-winning shows on PBS, on thisoldhouse.com, or on youtube.com anytime. For example, each season we feature one or two houses and their renovations. The editors of the magazine and website and the executive producer of the TV show work closely together to ensure that our fans can see the renovations on any platform. We also pin the homes and the items in them on our Pinterest page. Social media, especially Facebook, resonates well with our readers.” – Michelle Imbrogno, Advertising Sales Director, This Old House

Social media platforms have become powerful engagement tools. According to Nielsen’s Social Content Ratings in 2015, 60% of consumers are “second screeners,” using their smartphones or tablets while watching TV. Many second screeners use their devices to comment and interact with a digital community of fans. Games, quizzes and digital Q&A sessions can keep viewers engaged with their favorite programming on a variety of platforms. The NFL is experimenting with new engagement strategies and teamed up with Twitter in 2016 to livestream games and activate the digital conversation.

“There is a massive amount of NFL-related conversation happening on Twitter during our games, and tapping into that audience, in addition to our viewers on broadcast and cable, will ensure Thursday Night Football is seen on an unprecedented number of platforms.” – NFL Commissioner Roger Goodell

With social media optimization (SMO) software, television networks can better understand their audience and adjust their social media strategy quickly. Tracking website traffic and click rates simply isn’t enough these days. To stay on trend, companies need to start tracking new engagement indicators using Synaptik’s social media intelligence checklist:

Step 1: Integrate Social Listening Tools

The key to understanding your audience is listening to what they have to say. By tracking mentions, hashtags and shares you can get a better sense of trending topics and conversations in your target audience. Moreover, this knowledge can underpin your argument for higher price points in negotiations with media buyers and brands.
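At its simplest, social listening means counting mentions and hashtags across a stream of posts. A minimal sketch, using invented sample posts (a production system would pull live data from a platform API instead):

```python
import re
from collections import Counter

# Illustrative sample of posts mentioning a show (hypothetical data).
posts = [
    "Loving tonight's episode! #ThisOldHouse #renovation",
    "That kitchen reveal was amazing #ThisOldHouse",
    "Can't wait for next week #renovation",
]

HASHTAG = re.compile(r"#\w+")

def trending_hashtags(posts, top_n=3):
    """Count hashtag occurrences to surface trending topics."""
    tags = (tag.lower() for post in posts for tag in HASHTAG.findall(post))
    return Counter(tags).most_common(top_n)

print(trending_hashtags(posts))
# [('#thisoldhouse', 2), ('#renovation', 2)]
```

The same tallies that surface trending topics double as evidence of audience size and engagement when negotiating with media buyers and brands.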

Step 2: Conduct a Sentiment Analysis

Deciphering a consumer’s emotional response to an advertisement, character or song can be tricky, but sentiment analysis digs deeper, using natural language processing to understand consumer attitudes and opinions quickly. You can also customize outreach to advertisers based on the emotional responses they are trying to tap into.
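To make the idea concrete, here is a toy lexicon-based sentiment scorer. The word lists are invented and tiny; real sentiment analysis tools use trained NLP models, but the underlying principle of scoring text against sentiment-bearing vocabulary is the same:

```python
# Toy sentiment lexicons (illustrative only; production systems use
# trained natural language processing models).
POSITIVE = {"love", "amazing", "great", "funny", "best"}
NEGATIVE = {"boring", "hate", "worst", "annoying", "slow"}

def sentiment_score(text):
    """Score in [-1, 1]: balance of positive vs. negative words found."""
    words = [w.strip(".,!?") for w in text.lower().split()]
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    total = pos + neg
    return 0.0 if total == 0 else (pos - neg) / total

print(sentiment_score("I love this show, the finale was amazing!"))  # 1.0
print(sentiment_score("That episode was boring and slow."))          # -1.0
```

Run over thousands of posts about a show or an ad, scores like these aggregate into the attitude trends that inform programming and advertiser outreach.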

Step 3: Personality Segmentation

Understanding a consumer’s personality is key to messaging. If you want to get through to your audience you need to understand how to approach them. New social media tools like Crystal, a Gmail plug-in, can tell you the best way to communicate with a prospect or customer based on their unique personality. This tool can also help you customize your approach to media buyers and agents.

By creating more accessible content and building a digital community around it, television networks can expect to increase advertising revenue and grow their fan base. With Synaptik’s social listening tools, companies can track conversations around specific phrases, words, or brands. Sign up for a 30-minute consultation and we can show you what customers are saying about your products and services across multiple social media channels (Facebook, Twitter, LinkedIn, etc.).

Contributors:

Joe Sticca, Chief Operating Officer at True Interaction

Kiran Prakash, Content Marketing at True Interaction

By Nina Robbins

Categories
Insights

Real Estate: Climate-proof your Portfolio

The real estate industry is built on the power to predict property values. With sea levels on the rise, smart investors are thinking about how to integrate climate science into real estate projections. Complex algorithms and regression models are nothing new to developers and brokerage firms but the rapidly evolving data ecosystem offers breakthrough opportunities in resiliency marketing, valuation and forecasting.

In Miami, investors are starting to look inland for property deals on higher ground. According to a New York Times article by Ian Urbina, “home sales in flood-prone areas grew about 25% less quickly than in counties that do not typically flood.” To get in front of the wave, real estate investors and appraisers need to regularly update their forecasting models and integrate new environmental and quality of life data sets. Third party data can be expensive but as municipal governments embrace open data policies, costs may go down.

Today, no fewer than 85 cities across the U.S. have developed open data portals that include data on everything from traffic speed to air quality to SAT results. Real estate professionals are using data to do more than just climate-proof their portfolios. With high-powered business intelligence tools, businesses can turn this rich raw data into better insights on:

Home Valuation

Zillow, an online real estate marketplace, is leading the charge on better home-valuation data models. The company’s “Zestimate” tool is a one-click home-value estimator built on 7.5 million statistical and machine learning models that analyze hundreds of data points on each property. The company has now launched a $1 million prize competition calling on data scientists to create models that outperform the current Zestimate algorithm.
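The core idea behind any automated valuation model is a regression from property features to price. A minimal sketch, fitting an ordinary least-squares line from square footage alone to sale price (the data is invented, and real models like the Zestimate use hundreds of features, not one):

```python
# Hypothetical comparable sales (square footage -> sale price).
sqft  = [1200, 1500, 1800, 2100, 2400]
price = [240_000, 300_000, 355_000, 420_000, 480_000]

def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    a = cov / var            # price per square foot (slope)
    b = mean_y - a * mean_x  # intercept
    return a, b

a, b = fit_line(sqft, price)
estimate = a * 2000 + b  # predicted price for a 2,000 sq ft home
print(round(estimate))   # 399000
```

Adding more features (location, lot size, school ratings, flood-zone data) turns this one-variable line into the multivariate models that appraisers and marketplaces actually run.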

Design

According to the Census Bureau, single-person households made up about 13% of all American households in 1960; that number has since jumped to 28%. Additionally, data from the American Time Use Survey (ATUS) cited in a Fast Company article by Lydia Dishman revealed that the share of people working from home increased from 19% in 2003 to 24% in 2015. The rapid rate of technological change means a constant shift in social and cultural norms. The micro-apartment trend and the new WeLive residential project from WeWork are signs of changing times. For developers, the deluge of data being created by millennials provides incredible insight into the needs and desires of tomorrow’s homebuyers.

Marketing

Brokerage firms spend exorbitant amounts on marketing, but with big data in their pocket, real estate agents can zero in on clients who are ready to move and cut their marketing spend in half. According to a Wall Street Journal article by Stefanos Chen, savvy real estate agents use data sources like grocery purchases, obituaries and the ages of children in a household to predict when a person might be ready to upsize or downsize. This laser-sharp focus lets them spend their marketing budgets wisely and improve conversion rates across the board.

In today’s competitive marketplace, real estate professionals need a self-service data management and analytics platform that can be applied to any use case and doesn’t require advanced IT skills. Synaptik is designed to adapt to your needs and can easily integrate quantitative and qualitative data from websites, social media channels, government databases, video content sites, APIs and SQL databases. Real estate is big business, and better intelligence means better returns. Sign up for a demo and find answers to questions you didn’t even know to ask.

By Nina Robbins