Categories
Insights

3 Issues with Data Management Tools

The market is currently awash with BI tools making lofty claims about their ability to leverage data and ensure ROI. It is evident, however, that these systems are not created equal, and implementing the wrong one can adversely affect an organization.

Cost

While the consistent, multifold growth of the digital universe is ushering in lower costs for data storage, a decline reported to be as much as 15 to 20 percent in the last few years alone, it is also the catalyst for the rising cost of data management. It seems that the cause for concern regarding data storage does not lie in the storage technologies themselves, but in the increasing complexity of managing data. The demand for people with adequate skills within the realm of data management is not being sufficiently met, resulting in the need for organizations to train personnel from within. The efforts required to equip organizations with the skills and knowledge to properly wield these new data management tools demand a considerable portion of a firm’s time and money.

Integration

The increased capacity of a new data management system could be hindered by the existing environment if the process of integration is not handled with the proper care and supervision. With the introduction of a different system into a company’s current technological environment as well as external data pools (e.g. digital, social, mobile, and device data), the issue of synergy between the old and new remains. CIO identifies this as a common oversight and advises organizations to remain cognizant of how data is going to be integrated from different sources and distributed across different platforms, as well as closely observe how any new data management systems operate with existing applications and other BI reporting tools to maximize the insight extracted from the data.

Evan Levy, VP of Data Management Programs at SAS, shares his thoughts on the ideal components of an efficient data management strategy as well as the critical role of integration within this process, asserting that:

“If you look at the single biggest obstacle in data integration, it’s dealing with all of the complexity of merging data from different systems… The only reasonable solution is the use of advanced algorithms that are specially designed to support the processing and matching of specific subject area details. That’s the secret sauce of MDM (Master Data Management).”
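The matching problem Levy describes, reconciling the same entity across different systems, can be sketched in miniature as fuzzy comparison of normalized records. The following toy Python example uses hypothetical names and a hypothetical threshold; it illustrates the idea, not any actual MDM product's "secret sauce":

```python
from difflib import SequenceMatcher

def normalize(name: str) -> str:
    """Lowercase, strip punctuation and extra whitespace before comparing."""
    cleaned = "".join(ch for ch in name.lower() if ch.isalnum() or ch.isspace())
    return " ".join(cleaned.split())

def match_score(a: str, b: str) -> float:
    """Similarity ratio between two normalized names, from 0.0 to 1.0."""
    return SequenceMatcher(None, normalize(a), normalize(b)).ratio()

# The same customer arriving from two systems under different spellings:
crm_record = "ACME Corp."
erp_record = "Acme Corporation"

score = match_score(crm_record, erp_record)
print(score > 0.7)  # above a tuned threshold, treat as the same entity
```

Production MDM systems layer subject-area-specific rules, survivorship logic, and much more sophisticated algorithms on top of this basic comparison step.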

Reporting Focus

Amid this rapid expansion of data, the sheer, seemingly unwieldy volume is one major concern; the other is that most of it is unstructured. Many data management tools offer to relieve companies of this issue by scrubbing the data clean and meticulously categorizing it. The tedious and expensive process of normalizing, structuring, and categorizing data does admittedly carry some informational benefit and can make reporting on the mass of data much more manageable. However, in the end, a lengthy, well-organized report does not guarantee usable business insight. According to research conducted by Gartner, 64% of business and technology decision-makers have difficulty getting answers simply from their dashboard metrics. Many data management systems operate mostly as visual reporting tools, lacking the knowledge discovery capabilities imperative to producing actionable intelligence for the organizations that they serve.

The expenses that many of these data management processes pose for companies, and the difficulties associated with integrating them with existing applications, may prove fruitless if they are not able to provide real business solutions. Hence, data should not be collected indiscriminately, nor managed with little forethought. Before deciding on a Business Intelligence system, it is necessary to begin with a strategic business question to frame the data management process, in order to ensure the successful acquisition and application of big data, both structured and unstructured.

Joe Sticca, Chief Operating Officer of True Interaction, contributed to this post.

By Justin Barbaro


Ensure Data Discovery ROI with Data Management

The explosion of data is an unavoidable facet of today’s business landscape. Domo recently released the fourth annual installment of its Data Never Sleeps research for 2016, illustrating the amount of data generated in one minute on a variety of different platforms and channels. The astounding rate at which data has been growing shows no indication of slowing down, with forecasts anticipating a digital universe saturated in nearly 44 trillion gigabytes of data by the year 2020. With data being produced at an unprecedented rate, companies are scrambling to put data management practices in place to avoid being overwhelmed and eventually bogged down by the deluge of information that should be informing their decisions. There is a plethora of challenges in calibrating and enacting an effective data management strategy, and according to Experian’s 2016 Global Data Management Benchmark Report, a significant number of these issues are internal.

Inaccurate Data

Most businesses strive for more data-driven insights, a feat rendered more difficult by the collection and maintenance of inaccurate data. Experian reports that 23% of customer data is believed to be inaccurate. While over half of the companies surveyed attribute these errors to human error, a lack of internal manual processes, inadequate data strategies, and inadequacies in relevant technologies are also known culprits in the perpetuation of inaccurate data. And while erroneous data entry is still largely attributed to human oversight, it is the blatant lack of technological knowledge and ability that is barring many companies from leveraging their data, bringing us to our next point.

Data Quality Challenges

The sheer volume of information being generated every second is inescapable and highly important, and it warrants a need for organizations to improve their data culture. Research shows that businesses face challenges in acquiring the knowledge, skills, and human resources to manage data properly. This holds for organizations of all sizes and resources, not just large companies: a baffling 94% of surveyed businesses admit to having experienced internal challenges when trying to improve data quality.

Reactive Approach

Experian’s data sophistication curve identifies four different levels of data management sophistication based on the people, processes, and technology associated with the data: unaware, reactive, proactive, and optimized. While the ultimate goal is ascending to the optimized level of performance, 24% of the polled businesses categorize their data management strategies as proactive, while the majority (42%) admits to merely reaching the reactive level of data management sophistication. The reactive approach is inefficient in many ways; a prominent one is the data management difficulty, both internal and external, that comes from waiting until specific data issues crop up before addressing and fixing them, as opposed to detecting and resolving such problems in a timely manner.

The most deleterious disadvantage of failing to address these pressing issues as they are detected is the careless neglect of invaluable business insight that is concealed in the mass of available data. Data management systems that have not optimized their operations will not be able to process data to produce relevant information in a timely manner. The lack of machine learning mechanisms within these sub-optimal systems will hinder businesses in their knowledge discovery process, barring organizations from making data-driven decisions in real time.

Denisse Perez, Content Marketing Analyst for True Interaction, contributed to this post.

by Joe Sticca


Engaging and Keeping Your Most Valuable Mobile Customers

Retention

According to WhaTech, some 80-90% of apps downloaded from app stores are opened only once before being uninstalled. This has been the norm for years. That’s quite an endorsement for a proper preload with the business requirement that the user cannot remove the app. Facebook has found only about 6% of the apps it works with are still being used 30 days post-install. KISSmetrics found that it can be up to 7x more expensive to acquire a new user than to retain a current one. Acquisition campaigns should therefore be structured with retention in mind. This requires a keen understanding of the lifetime value (LTV) of one’s customer segments.
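As a rough illustration of segment LTV, a simple geometric model treats expected customer lifetime as 1 / (1 - retention rate). All figures below are hypothetical:

```python
def lifetime_value(avg_monthly_revenue: float, gross_margin: float,
                   monthly_retention: float) -> float:
    """Simple geometric LTV: monthly margin times expected customer lifetime.

    With a constant monthly retention rate r, expected lifetime in months
    is 1 / (1 - r).
    """
    expected_months = 1 / (1 - monthly_retention)
    return avg_monthly_revenue * gross_margin * expected_months

# Hypothetical segment: $10/month revenue, 60% margin, 80% monthly retention
base = lifetime_value(10.0, 0.6, 0.80)      # 6 * 5 months = 30.0
improved = lifetime_value(10.0, 0.6, 0.85)  # 6 * ~6.67 months
print(round(base, 2), round(improved, 2))
```

Note the leverage: in this toy model, a 5-point retention improvement lifts LTV by a third, which points in the same direction as the retention research quoted below.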

Who are your avid consumers? Which customers consistently purchase your higher-tier content? Who consistently subscribes to your services? You can use Synaptik to segment these buckets. Over the course of the next three years, programmatic will likely become the only way to engage them. Firms like Element Wave focus on user engagement and retention via real-time, precision targeting at key moments during games or app navigation. They utilize native messaging and well-timed push messages to drive users back to the app with incentives and contextual information, such as how many people in a geo-fenced region are placing a bet or attempting to bid on an item. Whether it’s gaming or non-gaming apps on Android or iOS, the latest study by Localytics shows Facebook is still the most valuable platform for targeting.

“Retention has transcended to becoming a highly objective metric to measure how users find your product or service,” says WhaTech. “Market research suggests that even a 5 percent increase in customer retention can lead to increasing profits by 25 percent to 125 percent.”

Customer Acquisition Cost

How much is your firm spending on acquisition costs for mobile ad campaigns? You need to look at cost per install and cost per registration, as well as your conversion rate for getting a customer from installing your app to registering as a user. There’s a big drop-off point here in mobile UX, which is why single sign-on is huge. As a mobile product manager, I found that the best way to convert installs to purchases was to deliver the experience of the application with deferred sign-on, without requiring initial registration. I often find myself deleting apps if the path to use is too cumbersome and requires too many fields to be filled out.

Obviously, the total cost per acquisition is a key figure, but we just want to make sure it’s less than the lifetime value of that acquisition. You decide what your margin needs to be. What does your model look like for 2017? Let us know if we can be of assistance with the development of your mobile app or website. Do you need some help getting your app to market or finding a way to get it preloaded onto the newest mobile handsets and tablets? Reach out. True Interaction has a team of mobile experts. At True Interaction, we stay ahead of the curve and help you do the same.
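The funnel math above can be sketched in a few lines. The campaign figures here are invented for illustration:

```python
def cost_per_customer(spend: float, installs: int,
                      install_to_reg: float, reg_to_purchase: float) -> float:
    """Blended acquisition cost per paying customer for a mobile campaign."""
    paying_customers = installs * install_to_reg * reg_to_purchase
    return spend / paying_customers

# Hypothetical campaign: $5,000 buys 2,000 installs; 40% of installers
# register; 25% of registrants go on to make a purchase.
cac = cost_per_customer(5000, 2000, 0.40, 0.25)
print(cac)  # $25 per paying customer; viable only if LTV comfortably exceeds it
```

Improving either conversion rate (for example, via deferred sign-on) lowers the effective cost per customer without spending another ad dollar.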

by David Sheihan Hunter Lindez


Digital Transformation Capability and the Modern Business Landscape

Yesterday morning, The Wall Street Journal announced that Goldman Sachs Group Inc. dropped out of the R3CEV LLC blockchain group. R3 has been notable for corralling 70 different banks and financial firms to join its group since 2014, including Bank of America, J.P. Morgan and State Street. A spokesperson commented on the company’s departure:

Developing technology like this requires dedication and significant resources, and our diverse pool of members all have different capacities and capabilities which naturally change over time.

For the record, Goldman Sachs will continue to invest in blockchain technology, including the startups Circle and Digital Asset Holdings, but there is only speculation as to exactly why Goldman Sachs’ membership with R3 expired. It may have been related to disagreements over the equity distribution models between R3 and its members, but just a month earlier, when R3 announced their blockchain proof-of-concept prototype exercise, R3 CEO David Rutter commented:

Quality of data has become a crucial issue for financial institutions in today’s markets. Unfortunately, their middle and back offices rely on legacy systems and processes – often manual – to manage and repair unclear, inaccurate reference data.

The truth is that there is still quite a bit of variation in digital capability across, within, and between businesses, big or small.

Getting the whole gang onboard

Perhaps Goldman Sachs’ departure is due to exactly this: some of its business units are behind the power curve in their digital transformation and data management efforts.

Digital transformation can be a painstakingly complicated process, partially because, according to Computer Weekly, some parts of the transformation process aren’t even executed by the organization itself, yet they remain the ultimate responsibility of the CIO and IT units and require all the vigilance they can muster:

Companies of all kinds are increasingly using technology partners, channel partners, contract manufacturers, warehousing and logistics partners, service partners and other outside services to handle all or part of a business process. Most enterprises come to view these partners as the extended enterprise, and look for ways to have tight integration and collaboration with them.

To achieve effective, successful transformation, digital business leaders must get their whole business ecosystem onboard with a clear, discernible, comprehensive strategic digital transformation plan that touches upon all of the extended enterprise. To assess and act on digital transformation opportunities, McKinsey suggests 4 steps:

1. Estimate the value at stake. Companies need to get a clear handle on the digital-sales and cost-reduction opportunities available to them. Digital—and digitally influenced—sales potential should be assessed at the product level and checked against observed internal trends, as well as competitor performance. On the cost side, administrative and operational processes should be assessed for automation potential, and distribution should be rightsized to reflect digital-sales growth. The aggregate impact should be computed and turned into a granular set of digital targets to monitor progress and drive value capture.

2. Prioritize. Most organizations don’t have the ability, resources, or risk tolerance to execute on more than two or three big opportunities at any one time. Be selective. Figure out what areas are likely to deliver the greatest return on investment and the best customer outcomes and start there. While digital requires some experimentation, too many ad hoc demos and showcases lead to scattershot investments that fail to deliver sustained value. One retailer, for instance, ended up with 25 subscale digital offerings by not culling in the right places.

3. Take an end-to-end view. One financial-services firm built a world-class digital channel but failed to update the paper-based processes that supported it—processes that were prone to error. That false veneer of speed and efficiency eroded trust and turned off customers. The moral? Although it may seem counterintuitive, overinvestment in a slick front end that is not matched with the corresponding high-quality fulfillment that customers now expect may actually lead to increased customer frustration.

4. Align the business portfolio accordingly. In the long run, some lines of business will simply be destroyed by digital. Hanging on and tweaking them is futile. Companies need to act purposefully and divest where it makes sense, identifying what holdings are likely to be cannibalized or likely to underperform in the new environment and sloughing them off. Conversely, some areas will clearly need new capabilities and assets, which companies often do not have the luxury to build up organically over time. One retailer used targeted acquisitions to rapidly build out its e-commerce capabilities, allowing it to focus on defining strategy and aspirations rather than tinkering with the “plumbing.” Source.

Creating new monetizable value

A recent report by Gartner revealed that organizations are often missing out on a bevy of monetizable value due to overemphasizing traditional silos and markets (marketing, social media, mobile applications, etc.). A too-narrow focus means organizations are getting only a small share of the full value that digital transformation can provide. Saul Judah, research director at Gartner, says:

All too often IT leaders focus value creation more narrowly, with the result that most digital initiatives are aimed at operational improvements, rather than value transformation. While this tactical approach to digital value can result in very real process and financial improvements, the greatest potential for digital value lies in more strategic initiatives, such as creating new markets, empowering employees, changing the basis of competition and crossing industry boundaries.

IT leaders need to work with the business side of the house to identify and exploit these high-value initiatives.

Algorithms and analytics offer accelerators of value and are themselves of exchangeable and monetizable value. An analytics process may use algorithms in its creation, which could also be monetizable through an algorithmic marketplace, making it available to enterprises of all types and sizes to use.

For example, True Interaction’s data agnostic machine learning analytics platform, SYNAPTIK, is rolling out a data marketplace where organizations can syndicate and distribute new data revenue opportunities and actions to their clients, as well as other platforms.

Digital transformation and the modern enterprise landscape

The Blockchain endgame?

Blockchain technology offers several benefits to an organization. The technology uses new methods of encryption that enable anonymous sharing of information in a data-rich environment. Blockchains also support smart contracts: computer protocols that facilitate, verify, or enforce the negotiation or performance of contract terms. And with blockchain, the dataset remains updated and intact at all times, without the need for a central governing authority.

Decentralized systems using blockchain technology can manage the data relationships and sequence of events where all parties share the same data source. Furthermore, with the impending intersection of the Internet of Things and blockchain technology, digitally tenacious organizations will soon be able to connect conceivably anything with anything, and get them to communicate intelligently and securely. Enterprises that embrace this phenomenon will be able to provide a better user experience and value-added services, as well as gain competitive advantage and differentiation.
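To make the tamper-evidence property concrete, here is a minimal hash-chain sketch in Python. This is a deliberately stripped-down illustration; real blockchains add consensus protocols, digital signatures, and peer-to-peer networking on top:

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    """SHA-256 over a canonical JSON encoding of the block."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_block(chain: list, data: dict) -> None:
    """Each new block records the hash of the block before it."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": prev, "data": data})

def verify(chain: list) -> bool:
    """Tampering with any earlier block breaks every later prev_hash link."""
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

chain: list = []
append_block(chain, {"tx": "A pays B 10"})
append_block(chain, {"tx": "B pays C 4"})
print(verify(chain))                    # True
chain[0]["data"]["tx"] = "A pays B 999"
print(verify(chain))                    # False: the shared dataset no longer checks out
```

This is why all parties can share the same data source without a central authority: any participant can re-verify the full chain independently.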

The rumble between Goldman Sachs and R3 shows us that we are still some way from settled standards for blockchain in business. With certainty, some markets need to topple age-old paradigms of strategic thinking that are no longer relevant in a digital world. But the promise is quite exciting.

by Michael Davison


Wrangling Data for Compliance, Risk, and Regulatory Requirements

(N.B. This article addresses the financial services industry; however, the insights and tips therein are applicable to nearly any industry today. ~EIC)

The financial services industry has always been characterized by its long list of compliance, risk, and regulatory requirements. Since the 2008 financial crisis, the industry is more regulated than ever, and as organizations undergo digital transformation and financial services customers continue to do their banking online, the myriad of compliance, risk, and regulatory requirements for financial institutions will only increase from here. On a related note, organizations are continuing to invest in their infrastructure to meet these requirements. IDC Financial Insights forecasts that the worldwide risk information technologies and services market will grow from $79 billion in 2015 to $96.3 billion in 2018.

All of this means reams of data. Financial firms by nature produce enormous amounts of data, and due to compliance requirements, must be able to store and maintain more data than ever before. McKinsey Global Institute reported in 2011 that the financial services industry has more digitally stored data than any other industry.

To succeed in today’s financial industry, organizations need to take a cumulative, 3-part approach to their data:

1. Become masters at data management practices.

This appears obvious, but the vast number of compliance, risk, and regulatory requirements necessitates that organizations become adept at data management. Capgemini identified 6 aspects of data management best practice:

Data Quality. Data should be kept optimal through periodic data review, and all standard dimensions of data quality (completeness, conformity, consistency, accuracy, duplication, and integrity) must be demonstrated.

Data Structure. Financial services firms must decide whether their data structure should be layered or warehoused. Most prefer to warehouse data.

Data Governance. It is of utmost importance that financial firms implement a data governance system that includes a data governance officer who can own the data and monitor data sources and usage.

Data Lineage. To manage and secure data appropriately as it moves through the corporate network, it needs to be tracked to determine where it is and how it flows.

Data Integrity. Data must be maintained to assure accuracy and consistency over the entire lifecycle, and rules and procedures should be imposed within a database at the design stage.

Analytical Modeling. An analytical model is required to parcel out and derive relevant information for compliance.
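Two of these dimensions, completeness and duplication, are straightforward to measure in code. A minimal sketch, using invented account records purely for illustration:

```python
def completeness(records: list, required_fields: list) -> float:
    """Share of records that populate every required field."""
    ok = sum(all(r.get(f) not in (None, "") for f in required_fields)
             for r in records)
    return ok / len(records)

def duplication(records: list, key_fields: list) -> float:
    """Share of records whose key repeats an earlier record's key."""
    seen, dupes = set(), 0
    for r in records:
        key = tuple(r.get(f) for f in key_fields)
        if key in seen:
            dupes += 1
        seen.add(key)
    return dupes / len(records)

accounts = [
    {"id": "A1", "name": "Acme", "country": "US"},
    {"id": "A2", "name": "Birch", "country": ""},   # incomplete record
    {"id": "A1", "name": "Acme", "country": "US"},  # duplicate key
]
print(completeness(accounts, ["id", "name", "country"]))  # 2 of 3 complete
print(duplication(accounts, ["id"]))                      # 1 of 3 duplicated
```

Tracking metrics like these during periodic data review gives the governance officer concrete numbers to act on, rather than anecdotes.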

2. Leverage risk, regulatory, and compliance data for business purposes.

There is a bright side to data overload; many organizations aren’t yet taking full advantage of the data they generate and collect. According to PWC, leading financial institutions are now beginning to explore the strategic possibilities of the risk, regulatory, and compliance data they own, as well as how to use insights from this data and analyses of it in order to reduce costs, improve operational efficiency, and drive revenue.

It’s understandable that in the business processes of many financial institutions today, the risk, regulatory, and compliance side of the organization does not actively collaborate with the sales and marketing teams. The tendency toward siloed structures and behavior in business makes it difficult to reuse data across the organization. Certainly an organization can’t completely change overnight, but consider the tips below to help establish incremental change within your organization:

Cost Reduction: Eliminate the need for business units to collect data that the risk, regulatory, and compliance functions have already gathered, and reduce duplication of data between risk, regulatory, compliance, and customer intelligence systems. Avoid wasted marketing expenses by carefully targeting marketing campaigns based upon an improved understanding of customer needs and preferences.

Increased Operational Efficiency: Centralize management of customer data across the organization. Establish a single source of truth to improve data accuracy. Eliminate duplicate activities in the middle and back office, and free resources to work on other revenue generating and value-add activities.

Drive Revenue: Customize products based upon enhanced knowledge of each customer’s risk profile and risk appetite. Identify new customer segments and potential new products through better understanding of customer patterns, preferences, and behaviors. Enable a more complete view of the customer to pursue cross-sell and up-sell opportunities.

3. Implement a thorough analytics solution that provides actionable insight from your data.

Today, it’s possible for financial organizations to implement an integrated machine learning component that runs in the background. It can ingest data of all types from any number of people, places, and platforms; intelligently normalize and restructure it so it is useful; run a dynamic series of actions based upon data type and whatever situations and contexts your business process is in; and create dynamic BI data visualizations out of the box.
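The ingestion-and-normalization step such a component performs can be sketched as a dispatcher that maps payloads of different shapes into one common schema. The sources, fields, and payloads here are invented for illustration:

```python
import csv
import io
import json

def normalize_record(source: str, raw: str) -> dict:
    """Map a payload from any registered source into one common schema."""
    if source == "crm_json":
        d = json.loads(raw)
        return {"customer": d["name"], "amount": float(d["value"])}
    if source == "legacy_csv":
        row = next(csv.reader(io.StringIO(raw)))
        return {"customer": row[0], "amount": float(row[1])}
    raise ValueError(f"no handler registered for source {source!r}")

# Mixed feed: a JSON payload from a CRM and a row from a legacy CSV export
feed = [
    ("crm_json", '{"name": "Acme", "value": "120.5"}'),
    ("legacy_csv", "Birch,75"),
]
unified = [normalize_record(src, raw) for src, raw in feed]
print(unified)  # [{'customer': 'Acme', 'amount': 120.5}, {'customer': 'Birch', 'amount': 75.0}]
```

Once every source lands in the same schema, downstream compliance checks and BI visualizations only have to be written once.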

Machine learning platforms like SYNAPTIK enable organizations to create wide and deep reporting, analytics, and machine learning agents without being tied to expensive, proprietary frameworks and templates such as Tableau. SYNAPTIK allows for blending internal and external data to produce new, valuable insights. There’s no data modeling required to drop in third-party data sources, so it is even possible to create reporting and insight agents across data pools.

By Michael Davison


The Top 9 Things You Need to Know About BI Software

Learning more about the data your business collects is important to evaluating the decisions you make today, next year, and in the next decade; that’s called business intelligence. But while software can do that for you, figuring out what software you should use can be a perplexing process.

The first step in evaluating that software is to figure out the platform you’ll be using: both its workflow and its architecture. You’ll also need to establish your goals and objectives, and understand who needs to use that business intelligence. Any software like this will require training, both at purchase and in continued use. And your software should also provide solutions for security.

As you evaluate your software, you have to ask questions, and more questions: How much support will you get? What features are on the roadmap?

Want to work through the options? Use the handy graphic below for steps and concerns.

by Michael Davison


Achieving Continuous Business Transformation Through Machine Learning

In the past, I’ve blogged about applying Agile methodology to businesses at large: what we are seeing in business today is that the concept of “business transformation” isn’t something that is undergone once or even periodically – business transformation is becoming a continuous process.

Today, businesses at large – not just their creative and development silos – benefit from operating in an Agile manner, most importantly in the area of responding to change over following a plan. Consider the words of Christa Carone, chief marketing officer for Xerox:

“Where we are right now as an enterprise, we would actually say there is no start and stop because the market is changing, evolving so rapidly. We always have to be aligning our business model with those realities in the marketplace.”

This is an interesting development, but how, technically, can businesses achieve True Interaction in a continually transforming world?

Business Transformation: Reliance upon BI and Analytics

In order for an organization’s resources to quickly and accurately make decisions regarding a product, opportunity, or sales channel, they must rely upon historic and extrapolated data provided by the organization’s Data Warehouse/Business Analytics group.

In a 2016 report by ZS Associates that interviewed 448 senior executives and officials across a myriad of industries, 70% of the respondents replied that sales and marketing analytics is already “very important” or “extremely important” to their business’ competitive advantage. Furthermore, the report reveals that in just two years’ time, 79% of respondents expect this to be the case.

However, some very interesting numbers reveal cracks in the foundation: Only 12% of the same respondents could confirm that their organization’s BI efforts are able to stay abreast of the continually changing industry landscape. And only 2% believe business transformation in their company has had any “broad, positive impact.”

5 Reasons for lack of BI impact within an organization

1) Poor Data Integration across the business

Many legacy BI systems include a suite (or a jumbled mess) of siloed applications and databases: there might be an app for Production Control, MRP, Shipping, Logistics, and Order Control, for example, with corresponding databases for Finance, Marketing, Sales, Accounting, Management Reporting, and Human Resources. In all, this creates a Byzantine knot of data hookups and plugins that are unique, perform a singular or limited set of functions, and are labor-intensive to install, scale, and upgrade.

2) Data collaboration isn’t happening enough between BI and Business Executives

Executives generally don’t appear to have a firm grasp of the pulse of their BI: only 41% of ZS Associates’ report participants thought that a collaborative relationship between professionals working directly with data analytics and those responsible for business performance exists at their company.

3) Popular Big Data Solutions are still siloed

Consider Ed Wrazen’s critique of Hadoop: During an interview at Computing’s recent Big Data and Analytics Summit, the Vice-President of Product Management at data quality firm Trillium Software revealed:

“My feeling is that Hadoop is in danger of making things worse for data quality. It may become a silo of silos, with siloed information loading into another silo which doesn’t match the data that’s used elsewhere. And there’s a lot more of it to contend with as well. You cannot pull that data out to clean it as it would take far too long and you’d need the same amount of disk storage again to put it on. It’s not cost-effective or scalable.”

4) Data Integration is still hard to do.

Only 44% of respondents in ZS Associates’ report ranked their organizations as “good” or “very good” at data aggregation and integration. 39% said that data integration and preparation were the biggest challenges within the organization, while 47% listed this as one of the areas where improvement would produce the most benefit.

5) Only part of the organization’s resources can access BI

Back in the day, BI used to be the sole province of data experts in IT or information analyst specialists. Now companies are seeing the benefits of democratizing the access and analysis of data across the organization. Today, a data analyst could be a product manager, a line-of-business executive, or a sales director. In her book Successful Business Intelligence, Cindi Howson, author and instructor for The Data Warehousing Institute (TDWI) famously remarked:

“To be successful with BI, you need to be thinking about deploying it to 100% of your employees as well as beyond organizational boundaries to customers and suppliers… The future of business intelligence centers on making BI relevant for everyone, not only for information workers and internal employees, but also beyond corporate boundaries, to extend the reach of BI to customers and suppliers.”

Business leaders should examine these symptoms in the context of their own organizations.

Is there a solution to these issues?

True Interaction CEO O. Liam Wright has a novel approach to a new kind of BI solution. One that involves machine learning.

“In my 20 years in the business I’ve discovered several worlds in business that never spoke to each other properly, due to siloed information spaces, communications, platforms and people. Today’s fluid world necessitates changing the BI game completely: If you want to have a high level of true interaction between your systems, platforms, customers, internal multiple hierarchical department structures, then you NEED to flip the game around. SQL-based Dashboards are old news; they are so 2001.

You can’t start with a structured, SQL based situation that inevitably will require a lot of change over time – organizations don’t have the IT staff to continually support this kind of situation – it’s too expensive.”

Instead, Liam took a different approach from the beginning:

“I thought, what if we capitalized on the sheer quantity and quality of data today, captured data in unstructured (or structured) formats, and put this data into a datastore that doesn’t care what type of data it is? Then, as opposed to expensive, rigid, SQL-based joins on data types, we implement lightweight ‘builds’ on top of the data. These lightweight builds enable businesses to start creating software experiences off of their base dataset pretty quickly. They also enable organizations to get Business Intelligence right out of the box as soon as they perform a build – dynamic dashboards and data visualizations which can become much more sophisticated over time, as you pull in and cross-pollinate more data. Then, when the data is further under control, you can create software agents that assist you in daily data processes, or agents that get those controls out the door.”
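The datastore-plus-builds idea Liam describes can be sketched in a few lines. This is an illustrative toy, not SYNAPTIK’s actual implementation; the names `RawStore` and `build` are invented here:

```python
class RawStore:
    """A toy schema-agnostic datastore: it accepts records of any shape."""
    def __init__(self):
        self.records = []

    def ingest(self, record):
        self.records.append(record)


def build(store, field_map):
    """A lightweight 'build': project whatever fields exist onto a common view,
    instead of enforcing a rigid SQL schema up front."""
    return [
        {target: rec.get(source) for target, source in field_map.items()}
        for rec in store.records
    ]


store = RawStore()
store.ingest({"tweet_id": 1, "text": "great product", "likes": 12})  # social data
store.ingest({"sku": "A-9", "units_sold": 40, "channel": "web"})     # commerce data

# A 'sales' build maps only the fields it cares about; records of other shapes
# pass through with empty values rather than breaking the pipeline.
sales_view = build(store, {"item": "sku", "sold": "units_sold"})
```

The point of the sketch is that new data shapes never require a schema migration – a new build is just a new field mapping over the same store.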

What is this BI and Machine Learning marriage, exactly?

So what exactly is Liam describing? Modern business has just crossed the threshold into an exciting new space: organizations can routinely implement an integrated machine learning component that runs in the background. It can ingest data of all types from any number of people, places, and platforms; intelligently normalize and restructure it so it is useful; run a dynamic series of actions based upon data type and whatever situational context your business process is in; and create dynamic BI data visualizations out-of-the-box.

True Interaction’s machine learning solution is called SYNAPTIK.

SYNAPTIK involves 3 basic concepts:

DATA: SYNAPTIK can pull in data from anywhere. Its user-friendly agent development framework can automate most data aggregation and normalization processes. It can be unstructured data from commerce, web, broadcast, mobile, or social media – audio, CRM, apps, or images. It can also pull in structured data, for example from SAP, Salesforce, Google, communications channels, publications, Excel sheets, or macros.

AGENT: A software agent is a program that acts on a user’s (or another program’s) behalf in a relationship of agency. Agents can be configured not only to distribute data intelligence in flexible ways, but also to directly integrate with and take action in other internal and external applications, enabling quicker transformation in service of your business processes and goals.

An Agent is composed of two parts: the operator and the controls. Think of an operator as the classic telephone operator of the 1940s who manually plugged in and unplugged your calls in the background. SYNAPTIK enables you to see how the operator works:

Operators can be written in several forms, such as JavaScript, PHP/cURL, or Python. Organizations can write their own operators, or True Interaction’s development team can write them for you. An Agent also gives the user a control interface – a form field, or drag-and-drop functionality – in order to add specific assets or run any variety of functions. In addition, SYNAPTIK makes it easy to connect to a REST API, enabling developers to write their own software on top of it.

BUILD: A build simply brings the DATA and AGENT components together, ultimately enabling you to better understand your organization’s various activities within your data space.
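The three concepts can be sketched together in miniature. The `Agent` class and `threshold_alert` operator below are invented for illustration, not SYNAPTIK’s real interfaces:

```python
class Agent:
    """An agent pairs an operator (the code that does the work) with
    controls (the user-set parameters exposed in the interface)."""
    def __init__(self, operator, controls):
        self.operator = operator
        self.controls = controls

    def run(self, data):
        return self.operator(data, **self.controls)


def threshold_alert(rows, field, limit):
    """Operator: flag rows whose value exceeds a user-chosen limit."""
    return [row for row in rows if row.get(field, 0) > limit]


# BUILD: bring the DATA and the AGENT together.
agent = Agent(threshold_alert, {"field": "temp_f", "limit": 40})
alerts = agent.run([{"temp_f": 38}, {"temp_f": 45}])
```

Changing a control (say, the limit) reconfigures the agent without touching the operator code, which is the separation the telephone-operator analogy is gesturing at.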

A new model of BI: What is the Return?

– Machine learning platforms like SYNAPTIK enable organizations to create wide and deep reporting, analytics, and machine learning agents without being tied to expensive, proprietary frameworks and templates, such as Tableau. SYNAPTIK allows for blending of internal and external data in order to produce new, valuable insights. There’s no data modeling required to drop in 3rd-party data sources, so it is even possible to create reporting and insight agents across data pools.

– With traditional methods, the process of data normalization requires hours and hours of time – indeed, the bulk of analysts’ time is spent on it today. This leaves very little time for deep analysis and no time for deep insight. With SYNAPTIK, data management that once took 400 hours a month can take minutes, freeing nearly all of those hours for the analysis, discovery, and innovation needed to deliver results.

– Not only is it possible to create your own custom reports and analytic agents, SYNAPTIK enables organizations to share their reporting agents for others to use and modify.

– The inherent flexibility of SYNAPTIK enables businesses to continually provide new data products/services to their customers.

– Not so far down the line: the establishment of the SYNAPTIK Marketplace, where you can participate, monetize, and generate additional revenue by allowing others to subscribe to your Agents and/or Data.

All of these returns contribute not only to augmenting organizational leadership and innovation throughout the hierarchy, but also to producing valuable intelligence monetization, breakthrough revenue, and improved “client stickiness” with the rollout of new data products and services. And, best of all, it puts businesses into a flexible data environment that quite elegantly enables continuous transformation as industries, markets, and data landscapes continue to change.

We’ve got the Experience you Need

True Interaction has logged hundreds of thousands of hours of research, design, development, and deployment of consumer products and enterprise solutions. Our services directly impact a variety of industries and departments through our deep experience in producing critical business platforms.

We Can Integrate Anything with… Anything
Per the endless variable demands of each of our global clients, TI has seen it all, and has had to do it all. From legacy systems to open source, we can determine the most optimal means to achieve operational perfection, devising and implementing the right tech stack to fit your business. We routinely pull together disparate data sources, fuse together disconnected silos, and do exactly what it takes for you to operate with tight tolerances, making your business engine hum. Have 100+ platforms? No problem. Give us a Call.

by Michael Davison

Categories
Insights

Robot Advocates: When Consumer Products Advocate on the Customer’s Behalf

The Situation

Recently, True Interaction lead backend developer Darrik Mazey experienced some troubling hardware issues – this time, outside of his hectic True Interaction project schedule.

His family’s two-year-old LG refrigerator went down, requiring multiple visits from the repairman to diagnose and fix, pulling him away from professional commitments, and generally making life annoying as he tried in vain to keep his family’s perishables from spoiling.

Leveraging Social Media, the ‘Old Fashioned’ Way

During this process, he learned the benefits of leveraging social media to improve his customer service experience with LG.

When Darrik began tweeting about the fridge, LG offered a $50 food reimbursement, and extended his warranty. But this was just the beginning of further product woes, which eventually became the catalyst for an awesome solution.

Problems with the refrigerator persisted after the first “repair” job, and Darrik eventually had to rent another fridge while further diagnosis continued. In all, he estimated the cost due to missed work and equipment rental to be over $1500, no small amount for a hard-working family man.

Implementing a Real-time Solution with Xymon

Being the veteran coder that he is, Darrik resolved to ensure that, should the hardware fail again, he would be informed about it immediately via Xymon. In a nutshell, Xymon monitors hosts, network services, and anything else you might configure it to do via extensions. It can periodically generate requests to network services — http, ftp, smtp and so on — and record if the service is responding as expected.

In a recent personal blog post, Darrik recounts his refrigerator woes and how he configured a ControlByWeb temperature module, connected it to the fridge, networked it via a wireless AP, and monitored the fridge and freezer temperatures via Xymon 4.3.10.

Darrik monitors his Xymon network via its own Twitter account, which conveniently tweets him status updates and alerts if something is awry. Now Darrik receives a tweet as soon as the fridge goes above 40 degrees.
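A setup like Darrik’s can be approximated with a small check script. This is a rough sketch, not his actual code: the sensor read is stubbed out where a real deployment would poll the ControlByWeb module, and the resulting status line would be handed to the `xymon` client rather than kept in a variable:

```python
THRESHOLD_F = 40.0  # the alert threshold from the post

def read_fridge_temp():
    """Stand-in for polling the networked temperature module."""
    return 42.5

def xymon_status(host, test, temp, threshold=THRESHOLD_F):
    """Build an Xymon-style status line: green when in range, red when not."""
    color = "red" if temp > threshold else "green"
    return f"status {host}.{test} {color} fridge at {temp:.1f}F (limit {threshold:.0f}F)"

msg = xymon_status("kitchen", "fridgetemp", read_fridge_temp())
# A live setup would pass `msg` to the xymon client, and a separate hook
# could tweet the same alert, as Darrik's Twitter integration does.
```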

Maximizing the Value of Social Media in the SPIME Era

Learning from his recent foray into using social media to improve his customer service experience, he has now configured the system so that – should the LG fridge malfunction again – it will tweet @LGUS, LG’s official Twitter account, with a status message, the current temperature inside the fridge, and a #fail hashtag.

We are living in an age where, with a little work and ingenuity, products can be configured to actually hold their parent company accountable – in public – regarding their quality and the customer service surrounding them. We’re nearly touching upon what futurist Bruce Sterling (who recently shared his take on Google Glass) describes as the “spime era”. Sterling uses the term SPIME for connected, information-rich objects that operate and communicate in an internet of things.

Since the onset of Friendster in 2002, social media has evolved from a form of informal communication, to the hottest marketing platform since TV advertisements, to a very viable channel for consumers to elicit and receive customer service. Certainly entrepreneurs and small business owners need to understand that their products and services live and die by consumer sentiment (robotic or otherwise) on social media. While this sounds like a precarious situation, the flipside is that with not too much effort, entrepreneurs and SMBs can preemptively mine social media regarding their products and services in order to unearth incredibly valuable intelligence. This includes discovering not only general consumer sentiment, but also other perspectives on their products and services they might never have considered, such as creative suggestions on improvement and unique new use cases, as well as using social media as a dragnet for discovering bugs, service gaps, or product flaws.

With that in mind, entrepreneurs and SMB owners should establish their social media intelligence network as soon and as thoroughly as possible in order to monitor relevant SM channels and get into the trenches to interface directly with their consumers. Utilize customers for the intelligence they can provide, and reward them – even if it’s just recognition or a “thank you” – when they provide insight on the product or service, even if it isn’t delivered directly to the organization. When businesses step in, get involved and interact, and consumers discover that they are listening and care about their opinion, they’ve just converted a customer for the long term.

The Future

So what’s next? What happens when consumers can make decisions about the products they choose to purchase, based upon products themselves sharing their real-time data on a social media channel? What happens when brands are judged by their ability to manage the self-communicating products they’ve birthed? And considering all this, where will manufacturer responsibility for their products end or extend in the near future? All of this is fascinating and mind-boggling stuff to ponder for a moment. I’d love to hear your thoughts.

by Michael Davison

Categories
Insights

Can Blockchain help Media’s Data Challenges?

There has been a lot of discussion around blockchain and its framework in the financial services world. However, in our organization’s continuing research, as well as in operations with our clients, we are beginning to uncover specific opportunities that span into the media space. We feel blockchain can and should serve as a universal, cost-effective foundation for data management and insight decision-making – regardless of organizational size, output, or industry.

There have always been three distinct challenges in our collective experience here at True Interaction:

1. Data aggregation and normalization:

With more and more linear and digital channel fragmentation, the process of aggregating and normalizing data for review and dissemination has weighed down efficient insight decision-making. One way to look at it: you spend 40-60% of your time aggregating and normalizing data from structured and unstructured data types (PDFs, emails, Word docs, CSV files, API feeds, video, text, images, etc.) across a multitude of sources (media partners, internal systems, external systems and feeds, etc.). This leaves little time for effective review and analysis – essential to determining any insight.

2. Data review, reporting, dashboards:

Once you have your data normalized from your various sources, you can move to building your reporting dashboards for review and scenario modeling. This process is usually limited to answering questions you already know to ask.

3. Insight and action

Actionable insight is usually limited, largely because most time and resources are allocated to the two steps above. This process should entail a review of your data systems: are they providing you with insights beyond the questions you already know to ask? In addition, where and how can “rules” or actions be automated in other processes and systems? Today there are plenty of business situations where manual coordination and communication are still needed in order to take action – forfeiting the competitive edge that comes from executing quickly.

Blockchain can provide an efficient and universal data layer to serve your business intelligence tool set(s). This can help consolidate internal data repositories when aggregating and normalizing data. It can also be the de facto ledger for all data management activities, as well as the database of record, easing and minimizing the management of proprietary data platforms and processes.
The following are additional resources:

How Blockchain Will Transform Media and Entertainment

http://blog.www.true.design/how-blockchain-will-transform-media-and-entertainment

Blockchain technology: 9 benefits & 7 challenges

https://goo.gl/uJfh4R

Blockchain just fixed the Internet’s worst attribute

http://blog.www.true.design/blockchain-just-fixed-the-internets-worst-attribute

By Joe Sticca

Categories
Insights

Blockchain Just Fixed the Internet’s Worst Attribute

Blockchain was originally conceived in 2008 and implemented in 2009 as a way to record all public transactions in the Bitcoin database, but the technology has applications across nearly every industry. Today, we are seeing a huge amount of investment in blockchain technology, and some exciting movement across several industries regarding its implementation, and with good reason. At its essence, the wonder of blockchain comes down to this: finally, electronic transactions of all varieties can be made possible without the need for a bank.

Today I want to discuss and explore blockchain and how it can revolutionize attribution, provenance, and copyright – the Achilles’ heel of digital media. We’ll also take a look at some companies that are spearheading the technology leap.

We are at an amazing point in history for artists. A revolution is going to happen, and next year it’s going to take over. It’s the ability of artists to have the control and the say of what they do with their music at large. The answer to this is in the blockchain. ~Imogen Heap, Grammy Award-winning singer-songwriter

Here’s how blockchain will fix the internet with regards to digital media: just as blockchain technology provides an open, decentralized ledger of monetary transactions, it can also be purposed as a self-verifying database of other types of time-stamped events – one such event could be, for example, the registration of a copyright. A blockchain record may also attach a hash of the work itself, the metadata associated with it, and even information about permitted uses of the piece of media. Following this idea, any new instance of the work – without that metadata – would not match the original record and would therefore be identified as a copy. “This can also help and empower artists and rights holders to manage and track the use of their work, as well as tie back into audience demographic data,” says True Interaction COO and resident digital media guru Joe Sticca.
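The registration idea can be made concrete with a short sketch. This only shows what such a record might contain – a real service would additionally anchor it to a public blockchain – and the function and field names here are invented for illustration:

```python
import hashlib
import json
import time

def make_copyright_record(work_bytes, metadata, timestamp=None):
    """Hash the work and bundle it with its metadata into a timestamped record."""
    record = {
        "work_hash": hashlib.sha256(work_bytes).hexdigest(),
        "metadata": metadata,
        "timestamp": int(time.time()) if timestamp is None else timestamp,
    }
    # Hash the whole record too, so later tampering with the metadata is detectable.
    record["record_hash"] = hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()
    ).hexdigest()
    return record

rec = make_copyright_record(b"...image bytes...",
                            {"creator": "Jane Doe", "license": "CC BY"})
```

A copy of the work stripped of this metadata would still hash to the same `work_hash` but carry no matching record, which is how it could be flagged as a copy.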

Blockchain technology may well stabilize the value of digital media. Anyone who uses social media is intimately aware of the phenomenon of sharing something online, and watching it morph and be stripped of its metadata as it makes its way across pinboards, Tweets, and/or Reddit. Think of the ramifications of blockchain: Finally, an internet where, when you share something, it intrinsically contains the information regarding its creator as well as the contract for its use. And while this technology is purposed to squash piracy, its effects will fuel a whole new world of online commerce.

Here are a few companies that are leading the charge:

Blockai

Originally envisioned as a ‘Netscape for Bitcoin’ in 2015, Blockai intended to utilize blockchain for a kind of social media stream that would allow users to send messages and authenticate items. Now Blockai has announced it has raised $547,000 in seed funding to relaunch as a blockchain copyright service: a tool that allows artists to authenticate and claim copyright over images. With Blockai, the process is as follows: create a piece of digital art, a photo, or anything else that can be copyrighted; register your copyright on the blockchain, a public ledger powered by Bitcoin, where the record is permanent and immutable; then receive a registration certificate with cryptographic evidence that protects your copyright. You own the certificate forever.

Revelator

Revelator solves copyright challenges in the music industry by integrating sales, marketing, accounting, and analytics into one unified system based upon blockchain technology, with fair pricing and levels of service for the individual artist, a manager, or a record label. Recently, the company announced it has raised $2.5 million in Series A funding led by Exigent Capital, with participation from Digital Currency Group and Israeli early-stage fund Reinvent (Revelator is headquartered in Tel Aviv). That’s on top of the $3 million the company had already raised.

Verisart

Verisart works in the same way, but in the realm of physical and fine art. The digital startup intends to chronicle original artworks, prints, multiples, and books by using blockchain, thereby assigning each object an unassailable certificate of authenticity. Each object’s provenance, in turn, is created step-by-step by its owners, who add their e-mail addresses to an object’s data when they purchase one. Ultimately, Verisart plans to catalogue older artwork by extending the verification service to online sites and artists’ estates, and eventually begin working with art appraisers and insurers to add works that have already been validated. The company has received $2M in funding from Earlybird Venture Capital, Freelands Ventures, and Digital Currency Group.

Monegraph

Monegraph specializes in media with multiple creators or complex attribution. The Monegraph platform enables sharing of revenue across the value chain of media distribution for online broadcasts, video clips, image reels, and other licensed or brand-sponsored content. According to Monegraph, distributing media via their revenue-sharing infrastructure aligns the interests of all stakeholders in the creation and promotion of the content in such a way that distribution is dramatically amplified. This allows the media to reach localized communities of viewers not obtainable through centralized points of distribution. Monegraph has raised $1.3M in funding for its platform.

Ascribe

Ascribe is another copyright platform that enables creators to claim authorship and generate a certificate of authenticity on a piece of media. More than 20 marketplaces and services have integrated the platform, and close to 4,000 artists are using the service. The company raised $2 million in seed funding from Earlybird Venture Capital, Freelands Ventures, Digital Currency Group, and various angels. They’re a developer-friendly company with plenty of documentation around their REST API. Ascribe also includes features such as setting limited editions for pieces of digital media, loaning or renting a piece of digital media (granting a temporary license for a specific range of dates), and tracking the chain of ownership of a work of art.

By Michael Davison

Categories
Insights

How Blockchain will Transform Media and Entertainment

I recently blogged about blockchain where I described the technology and why it could be a revolutionary development, and also explored its various applications across a number of industries. Today, let’s take a closer look at blockchain in entertainment and media: its various applications, how it is transforming the industry, and its implications for the near future.

Quick Recap

A blockchain is a type of data structure that can be purposed to create a digital ledger of transactions shared among a distributed network of computers. By using cryptography, each account on the network may access and manipulate the ledger securely, without the need for any central authority or middleman – this is the key concept to take away. Blockchain is:

Reliable and available. Because a wide circle of participants share a blockchain, it has no single point of failure and is designed to be resilient in the face of outages or attacks. If any node in a network of participants fails, the others will continue to operate, maintaining the information’s availability and reliability.

Transparent. Transactions on the blockchain are visible to its participants, increasing auditability and trust.

Immutable. It is nearly impossible to make changes to a blockchain without detection, increasing confidence in the information it carries and reducing the opportunities for fraud.

Irrevocable. It is possible to make transactions irrevocable, which can increase the accuracy of records and simplify back-office processes.

Digital. Almost any document or asset can be expressed in code and encapsulated or referenced by a ledger entry, meaning that blockchain technology has very broad applications, most as yet unimagined, much less implemented.

Given these characteristics, blockchain technology will likely be a catalyst for transformative innovation in nearly every industry.
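The immutability property in particular falls out of how blocks are chained by hashes. A toy single-node ledger makes this mechanical – real blockchains add distributed consensus, signatures, and much more, so treat this purely as an illustration:

```python
import hashlib
import json

GENESIS_HASH = "0" * 64  # sentinel "previous hash" for the first block

def add_block(chain, payload):
    """Append a block whose hash covers its payload and the previous block's hash."""
    prev_hash = chain[-1]["hash"] if chain else GENESIS_HASH
    body = {"payload": payload, "prev_hash": prev_hash}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    chain.append(dict(body, hash=digest))
    return chain

def is_valid(chain):
    """Recompute every hash; any edit to an earlier block breaks the chain."""
    for i, block in enumerate(chain):
        expected_prev = chain[i - 1]["hash"] if i else GENESIS_HASH
        body = {"payload": block["payload"], "prev_hash": block["prev_hash"]}
        recomputed = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if block["prev_hash"] != expected_prev or block["hash"] != recomputed:
            return False
    return True

ledger = []
add_block(ledger, {"from": "alice", "to": "bob", "amount": 5})
add_block(ledger, {"from": "bob", "to": "carol", "amount": 2})
```

Because each block’s hash is baked into the next block, quietly rewriting history requires recomputing every subsequent block – which, on a real network, every other participant would reject.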

Blockchain as purposed in Entertainment and Media

Payments

One of the most obvious applications of blockchain in the media is its ability to support micropayments that can be processed without the need for an intermediary payment network or its fees. Generally speaking, without blockchain, intermediary payment fees are too cost-prohibitive to enable micropayments of less than $1. Chris Dixon explained this wonderfully in an article by venture capitalist Marc Andreessen:

“Let’s say you sell electronics online. Profit margins in those businesses are usually under 5 percent, which means conventional 2.5 percent payment fees consume half the margin. That’s money that could be reinvested in the business, passed back to consumers or taxed by the government. Of all of those choices, handing 2.5 percent to banks to move bits around the Internet is the worst possible choice. Another challenge merchants have with payments is accepting international payments. If you are wondering why your favorite product or service isn’t available in your country, the answer is often payments.”
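Dixon’s arithmetic is easy to verify. On a hypothetical $100 sale, using the margin and fee rates he cites:

```python
price = 100.00            # illustrative sale amount
margin = price * 0.05     # ~5% profit margin -> $5.00
card_fee = price * 0.025  # ~2.5% conventional payment fee -> $2.50

# The fee really does consume half the margin.
fee_share_of_margin = card_fee / margin
```

At micropayment scale the problem is worse still: fixed per-transaction fees (often tens of cents on card networks) can exceed the entire price of a sub-$1 item.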

Ostensibly very little would change from a consumer standpoint; consumers are used to purchasing songs for $0.99. But behind the transactional curtain, everything would be transformed.

When digital objects can be cryptographically associated with their creators, there is no need for a middleman or for distribution channels. Think about that for a moment. This technology can put all kinds of media directly under the control and management of their creators, obviating the need for iTunes, Netflix, or Amazon. Or record labels. Or publishing houses. Blockchain has the potential to erode the financial paradigms that conglomerate media companies have been using for the past century. Says Bruce Pon:

Imagine a future where creators upload their content to Facebook. There’s a “Buy” button on the bottom right corner. A consumer clicks it, and in a split second, the content is licensed to them, payment flows in the opposite direction to the creator, and the transaction is recorded on the blockchain.

Today, we are already seeing startups that are exploring new payment models through blockchain technology that are focused upon bringing more value to content creators. Ujo, in their own words, is a “home for artists that allows them to own and control their creative content and be paid directly for sharing their musical talents with the world.” The Ujo platform uses blockchain technology “to create a transparent and decentralized database of rights and rights owners, and automates royalty payments using smart contracts and cryptocurrency,” says Phil Barry, founding partner at Edmund Hart, which oversees Ujo. “We hope that it will be the foundation upon which a new more transparent, more efficient and more profitable music ecosystem can be built.”

Scarcity

Generally, digital objects can lose value because they are easily copied. We see this especially in the area of pirated music, movies, and TV. But because blockchain makes it possible for creators to register the origin of a work, set sharing permissions, and structure the means of exchange they’re willing to accept, it is possible to create conditions for “digital scarcity”.

Consider a situation where an artist creates a piece of music in .mp3 format, and programs the ability for only 1000 people to listen to it, with the price of the .mp3 increasing with each new listener.
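Such programmed scarcity could be expressed in a few lines. The cap, base price, and per-listener increment below are invented for illustration; a real implementation would encode the rule in a smart contract rather than a script:

```python
MAX_LISTENERS = 1000  # the "edition size" the artist programs in
BASE_PRICE = 0.99     # price for the first listener
STEP = 0.01           # each additional listener raises the price one cent

def price_for_listener(n):
    """Price paid by the n-th listener; the edition sells out after the cap."""
    if not 1 <= n <= MAX_LISTENERS:
        raise ValueError("edition sold out")
    return round(BASE_PRICE + (n - 1) * STEP, 2)
```

Under this toy rule the first listener pays $0.99 and the thousandth pays $10.98, after which no further copies can be licensed – scarcity enforced by the ledger rather than by physical media.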

Relationship to consumer

Because blockchain precludes the need for a middleman, the technology creates new opportunities for large corporations to get closer to their customers and consumers. Because the playing field allows individual creators to connect with their consumers directly, the onus is on the bigger media companies to operate more nimbly and to offer more varied and interactive pricing models for their content, based upon every individual consumer’s actions and purchases. And because provenance, payment, and distribution become simpler and less expensive to manage with blockchain, bigger media companies can concentrate more on creating quality content itself.

Reflection

While we haven’t seen it just yet, blockchain technology promises to transform a myriad of industries – especially media and entertainment. To the media consumer, it likely means more access to more content, from more creators, on a much more personal, secure, and granular level. For content creators, it means a much simpler attribution, payment, and distribution system, and the ability to be creative with payment models and the concept of “digital scarcity”. It will be exciting to see what happens.

If you are an SMB owner and want to learn more about blockchain, check out my first post on the subject. And by all means, get in touch with an expert. The time is now to begin exploring implementation of blockchain technology in your business. Consider these statistics:

– A billion dollars in venture capital has flowed to more than 120 blockchain-related startups, with half that amount invested in the last 12 months.
– Thirty of the world’s largest banks have joined a consortium to design and build blockchain solutions.
– Nasdaq is piloting a blockchain-powered private market exchange.
– Microsoft has launched cloud-based blockchain-as-a-service.


By Michael Davison

Categories
Insights

Forget Your Development Team – is Your Business Agile?

Situation

In my last article, I interviewed business strategy consultant Michael Farmer of Farmer & Company regarding his new book, Madison Avenue Manslaughter, which details the plight of advertising agencies and their deteriorating situation today brought about by several paradigm shifts, including the shift from commissions to fees, brand globalization, the rise of holding companies, client obsession with shareholder value, and the digital and Internet revolutions. In the interview, I touched upon a quote by Albert Einstein:

We can’t solve problems by using the same kind of thinking we used when we created them.

While Einstein’s words most definitely apply to the trend in advertising agencies as detailed in Mr. Farmer’s book, let’s put away the magnifying glass, pull back for a moment, and explore business at large.

First of all, the average lifespan of an S&P 500 company has decreased from 61 years in 1958 to 27 years in 1980, to just 18 years now, and that number is diminishing as I write this. On average, an S&P 500 company is now being replaced about once every two weeks. And the churn rate of companies has been accelerating over time.

Comparing the 1955 Fortune 500 to the 2015 Fortune 500 (ranked by total revenues), only 61 companies appear on both lists. Nearly 88% of the companies from 1955 have either gone bankrupt, merged with (or been acquired by) another firm, or still exist but have fallen from the top 500. In other words, only 12.2% of the Fortune 500 companies in 1955 were still on the list 60 years later in 2015. Most of the 1955 companies on the list are unrecognizable today: Armstrong Rubber, Cone Mills, Hines Lumber, Pacific Vegetable Oil, and Riegel Textile. Today, successful companies need to explore new products, markets, and business models more frequently in order to continuously renew their advantage. According to BCG Perspectives,

“…companies face circumstances that change more rapidly and unpredictably than ever before because of technological advances and other factors. As a result, companies need to constantly renew their advantage, increasing the speed at which they shift resources among products and business units. Second, market share is no longer a direct predictor of sustained performance.”



Defined by reduced time between innovation and adoption, increased market unpredictability, and reduced importance of market share, our modern business era has unveiled new drivers of competitive advantage – one of the most important being: the ability to adapt to changing circumstances or to shape them. This echoes Disney CEO Bob Iger’s famous quote: “The riskiest thing we can do is just maintain the status quo.” In a recent study of more than 900 business leaders, 93% responded that they “have completed, are planning, or are in the midst of a business transformation”. Really, what we are seeing is that “business transformation” isn’t something that is undergone once or even periodically – business transformation is becoming a continuous process.

Indeed today, businesses at large – not just their creative and development silos – benefit from operating in an Agile manner, most importantly in the area of responding to change over following a plan. Consider the words of Christa Carone, chief marketing officer for Xerox:

“Where we are right now as an enterprise, we would actually say there is no start and stop because the market is changing, evolving so rapidly. We always have to be aligning our business model with those realities in the marketplace.”

Solution

The situation in business today inevitably raises the question: “Where will your business be in 20, or even 10, years?” Statistically, 9 of 10 people reading this are working for an organization that will NOT stand the test of time. But the good news, which I’ve blogged about in the past, is that progressive businesses that take the technology leap and invest in the future will reap tremendous gains over their less progressive peers. With that in mind, ALL SMBs should take the time to reassess the value of their business processes and technology solutions as soon as possible.

Need help determining the right solution? Consider these 9 criteria:

1. How easy and intuitive is the user interface?

– Affordance: Visually, the UI has clues that indicate what it is going to do. Users don’t have to experiment or deduce the interaction. The affordances are based on real-world experiences or standard UI conventions.

– Expectation: Functionally, the UI delivers the expected, predictable results, with no surprises. Users don’t have to experiment or deduce the effect. The expectations are based on labels, real-world experiences, or standard UI conventions.

– Efficiency: The UI enables users to perform an action with a minimum amount of effort. If the intention is clear, the UI delivers the expected results the first time so that users don’t have to repeat the action (perhaps with variations) to get what they want.

– Responsiveness: The UI gives clear, immediate feedback to indicate that the action is happening, and was either successful or unsuccessful.

– Forgiveness: If users make a mistake, either the right thing happens anyway or they can fix or undo the action with ease.

– Explorability: Users can navigate throughout the UI without fear of penalty or unintended consequences, or of getting lost.

– No frustration: Emotionally, users are satisfied with the interaction.

2. How quickly and easily can the solution be implemented?

Does the solution offer an accelerated implementation approach to minimize demands on your resources? Rapid implementation techniques can reduce costs by more than 50 percent – again, this takes us back to the subject of Agile methodology.

3. How easily can the solution integrate with your supply chain, product development, and business processes?

No system operates in a vacuum, and it delivers the most value when embedded in the business! Your solution should have multiple points of integration, so that all business processes are outfitted with historical data in order to discern insights and take action.

4. Can the solution easily scale as your business grows?

Change. The only thing that remains constant. Take into account not only number of users, but also specific roles and functions and the need to support end-to-end business processes, which are constantly changing.

5. Is the business solution available as SaaS or subscription?

You can’t always anticipate your future, so being fiscally conservative is important. On-demand business solutions are often available on a subscription basis, virtually eliminating the traditional upfront investments. Alternatively, if cash is a major issue, your solution provider should offer you some flexibility in billing, payment, intellectual property, and ownership – allowing you to keep your cash working while you get the benefits of the newest business technology solutions.

6. Does the solution offer you company-wide visibility into your business processes?

Link up with a solution provider who understands business process management. The right solution can help you gain a HUGE competitive advantage through increased visibility into critical business functions, superior reporting, integrated processes, and even increased customer loyalty/retention, more in-depth customer insights, and an accelerated product time to market.

7. Are there ample resources to assist you with your implementation and ongoing support?

Look for business partners with long-term business experience and support services, as well as expertise in cross-functional, strategic, technology, and software solutions.

8. Is industry-specific expertise built into the product?

The best business solutions are not plain vanilla. Your solution provider should understand your industry as well as you do, and address any industry-specific needs, support roles, and functions unique to your vertical markets.

9. Does it provide you with any real-time monitoring and analytics?

By Michael Davison

Madison Avenue Manslaughter: An Interview With Author Michael Farmer

The rise-and-conquer story of the advertising industry after the end of World War II has become woven into the fabric of modern American folklore: ads and commercials from the Golden Age of advertising (1945-1975) are forever etched in Baby Boomers’ memories, while the industry’s Mad Men themselves have been celebrated and further mythologized in our entertainment. The ad agency exec archetype, with his swagger and his 3-martini lunch, is one of the most familiar characters in American culture, while those actual Mad Men of the Golden Age, who drummed their concepts of “Big Ideas”, “Creativity”, and “Unlimited Service” into their clients, left such a mark upon advertising agency culture that it pervades the industry to this day, and remains the template for today’s advertising.

The problem with this, according to Michael Farmer, Chairman of Farmer & Company LLC, a strategy consulting firm for advertising agencies and advertisers, is that the industry has been turned completely on its head since the Golden Age, and the paradigms that were in place then cannot address the state of the industry today. Peril is close at hand:

Today’s Mad Men celebrate new clients and creative awards just like the Mad Men of yesteryear, with champagne, parties and laudatory speeches, but the resemblance and the fun stop there. Returning to their daily routines, ad agency people put on a brave face, struggle with increasing workloads and demanding clients, and feel like players on a losing team, unable to break out or at least pull even with their clients as respected, secure partners. The advertising business, which was once one of the most fulfilling and glamorous of industries, has become a grim sweatshop for the people who do the work.

The system is broken, says Mr. Farmer, and the ad industry is in dire straits. His riveting new book Madison Avenue Manslaughter recounts the “dizzying heights” of the Mad Men days, and tracks a timeline of the key events and technologies – such as remuneration changes, globalization, new ownership, shareholder value, and digital and social media – which brought about the weakening health of today’s advertising agencies, and are now typified by ever-growing and unaccountable workloads, reduced client fees, and shortened or one-off client engagements.

With a richly depicted history and a candid, thorough examination of the current state of advertising agencies, Madison Avenue Manslaughter lays out a detailed 10-step transformation program for those progressive industry CEOs who want to “restore organizational health, financial well being and renewed strategic relevance for their ad agencies”.

I recently had a short conversation with Michael Farmer, where we discussed Madison Avenue Manslaughter and mused about the future of the advertising industry.

Michael, first let me commend you on your book. As an advertising industry outsider, I found the setup of your argument – the comprehensive history and explanation of the current state of affairs – so richly detailed that it felt like a page-turner. I learned quite a lot; the theme of your book brought to my mind a quote by Albert Einstein that I think is quite applicable to what you are describing:

We can’t solve problems by using the same kind of thinking we used when we created them.

How does this resonate in your mind with regards to what you describe in your book?

It’s hard to argue with Einstein! Yet, the mystery of the ad agency business is that executives are wedded to the concepts that created success in the period 1960 through 1980 — even though the conditions that allowed this past success do not exist. For example, agencies still believe that “highly creative TV ads drive client brand sales.” Well, that was true when TV was a novelty, as it was in the ’60s and ’70s, and amusing ads were a new thing… but today? TV ads are no longer a novelty, and we’re familiar with all the cliches and attempts to amuse. We’ve each digested several hundred thousand ads since that day, and we’re sick of them! Pure creativity is not the formula for success. Furthermore, agencies are paid 1/3rd what they used to be paid, so they can’t afford “full service.” Let’s face it, the world has changed, but they’re stuck in the past.

Agency remuneration in the Golden Age was commissions-based. Can you briefly describe the shift to today’s model, and what effect this has on workload?

Agencies then received 15% of their clients’ spend on media — for TV, radio and print. That covered ad creation. How much work they did was irrelevant to how much they were paid. In the ‘90s, though, most of the industry was required to change to “fee-based” remuneration, which means they are paid for the number and cost of people who work on a client’s account, plus some additional money for overhead and profit. This should correlate with the amount of work they do, but in fact it does not! Clients and agencies agree on fees and agency headcounts, but nowhere in the system do they clearly spell out how much work is to be done and how many people it actually requires to get it done. This is a holdover from the “full service” days when remuneration was based on commissions. The new system, then, allows clients to grow the workloads but hold the agency fees and resources (people) constant, and that’s what happens. Workloads grow, but fees and resources are driven downwards. More work, fewer people. A complete disaster, and it continues every day!

You make an observation in your book that even as SOW communication happens after the fact and creative workloads skyrocket, unmeasured and completely independent of agency resources or fees, the typical agency C-level does NOT want to know about or address the issue. Can you elaborate on the managerial passivity that pervades the industry, and why that is the case?

The passivity is irresponsible, in my view. Agency CEOs are not doing enough to ensure that their agencies are paid for all the work they do. They are reluctant to throw their weight behind “SOW tracking systems” that would be updated regularly by their senior client heads, and they absolutely are uninterested in reviewing client head performance — finding out who is giving away work and who is not. I can’t understand this, but it appears that they don’t really want to manage their organizations. They want to win new business and be viewed as creative geniuses, but they have little appetite for the hard work of management.

Your consultancy has built a database of Scope of Work (SOW) briefs, and has established a metric for measuring workload across them: the ScopeMetric(R) Unit, or SMU. Can you briefly explain the metric, how your organization uses it, and why it’s important?

Early in my consulting career with agencies, I found that I needed three things to understand agency operations: 1) the amount of work they were doing for each client; 2) their fees by client; and 3) the resources they allocated to each client. This is simply logical: “what are you doing for each client; how much are they paying you; how many people does it take.” Across an agency office of 20 clients, there would surely be “good clients” and “bad clients,” where the alignment among workload, fees and resources was out of whack. I needed to identify those situations. In order to do so, I had to figure out how to measure workload. Today, there’s a huge difference among the relative sizes of a TV ad, a print ad, a Tweet, and an online ad banner. I decided to use creative man-hours as my basic measurement, using them to measure the size of different deliverables, categorized by media type (e.g., TV), media detail (TV:30), origination versus adaptation, and according to creative complexity (low, average, high).

I now have a database of about 7,000 deliverables, each with a unique SMU value based on creative man-hours. The use of an SMU allows me to calculate “price” (fees divided by SMU workload), “productivity” (SMUs per creative person per year), and other metrics.
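To make the arithmetic behind those two metrics concrete, here is a minimal sketch; the client names, fees, workloads, and headcounts below are all hypothetical, not figures from Farmer's database:

```python
# Illustrative SMU math: "price" = fees / SMU workload,
# "productivity" = SMUs per creative person per year.
# All figures are hypothetical.

clients = {
    # client: (annual fee in $, workload in SMUs, creative headcount)
    "Client A": (2_400_000, 300.0, 10),
    "Client B": (1_100_000, 220.0, 6),
}

for name, (fee, smus, creatives) in clients.items():
    price = fee / smus                # dollars earned per unit of work
    productivity = smus / creatives   # SMUs produced per creative per year
    print(f"{name}: price ${price:,.0f}/SMU, "
          f"productivity {productivity:.1f} SMUs/creative/yr")
```

A client with a low price per SMU and high SMUs per creative is exactly the "too much work and too little fee" misalignment the interview describes.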

Advertising agencies do not have a system in place for measuring workloads. Do you know of any turnkey “workload management platforms”, SOW measurement and management tools, or other solutions on the market today? Are there any early adopters?

Advertisers, on their own, have used systems like Decideware to keep track of agency deliverables, but even Decideware does not have a way of measuring the amount of work in the SOWs. We may team up with them to combine Farmer metrics with their system. Agencies, though, are resisters.

So what can be done? What is the ‘next best step’ that agency C-level management can take, right now?

If I were an agency CEO today, my first step would be to announce a policy. It would sound something like this:

Every client that we serve will have its SOW documented in a uniform way, using an agency-wide SOW tracking system based on Farmer’s SMU metrics. Every client head will ensure that his / her SOWs are kept up to date in the tracking system. Every quarter, we will review client performance, client-by-client, examining the alignment of client workloads, fees and agency resources. Clients whose workloads, fees and resources are misaligned in some way — like ‘too much work and too little fee’ — will require corrective action by client heads. We will review client head performance in correcting misalignments. Needless to say, it is imperative for our agency to be paid correctly for all the work we carry out, and for the resources required to carry out the work.

Michael, your book is currently for sale, and provides readers with a detailed cross-section of the operations and financials of a model agency, as well as a 10-step transformation program for CEOs. Are there any other resources, online or otherwise, that you would recommend to your audience?

I write a blog from time to time, and it is published on http://farmerandco.com. The blog is a place where I can comment on developments in the industry associated with the under management of SOWs and agency remuneration. I try to make this an interesting resource — let me know how well I’m doing!

Madison Avenue Manslaughter is the winner of the 2016 Axiom Awards Gold Medal for Marketing Books, and is available online and in selected bookstores nationwide.

By Michael Davison

The Changing Terrain of Media in the Digital Space

The rapid digitization of the media industry does not merely address the immediate needs posed by the market, but also anticipates the constantly changing consumer behavior and rising expectations of an increasingly digital customer. The World Economic Forum points to a growing middle class, urbanization, the advent of tech-savvy millennials demanding instantaneous access to content on a variety of platforms, and an aging world population that is invariably accompanied by the need for services designed for an older audience as the most pronounced demographic factors currently reshaping the media landscape. The expanding list of accommodations that customers are coming to expect from the media industry more or less fall within the realms of accessibility and personalization of content.

The Path to Digital Transformation

Average weekday newspaper circulation has been on a steady decline, falling another 7% in 2015 according to the most recent Pew Research Center report. This inevitable dwindling of interest in print publications could be ascribed to the rising demand for media companies to adopt a multi-channel strategy that enables the audience to access content across different platforms. Companies remedy the absence of a formidable digital presence in a variety of ways. One of the most common resolutions involves redesigning the business model by bundling print subscriptions with mobile device access, a measure enacted to address the 78% of consumers who view news content on mobile browsers. A more radical approach could be opting for a complete digital transformation, a decision reached by The Independent earlier this year when it became the “first national newspaper title to move to a digital-only future.” The appeal of having information become readily available on any screen of the customer’s choosing is magnified by the expectation of uniformity and equally accessible and engaging user interfaces across all devices. Of course, convenience to the customer does not only rely on their ability to access content on the platform of their choice, but also at any point they desire, hence the focus on establishing quick response times and flexibility of content availability.

Another expectation that consumers have come to harbor, aside from unhindered access to content: the minimization, if not the complete elimination, of superfluous information. According to the 2016 Digital News Report by the Reuters Institute, news organizations such as the BBC and the New York Times are striving to provide more personalized news on their websites and applications. In some cases, people are offered information and clips on topics in which they have indicated an interest. Additionally, companies are employing a means of developing “auto-generated recommendations based in part on the content they have used in the past.” Transcending written material, streaming platforms like Pandora and Netflix utilize Big Data to analyze and discern the characteristics and qualities of an individual’s preferences, feeding information into a database that then uses predictive analytics to surface content the individual would be predisposed to enjoy. In previous blog posts, we have discussed the value of understanding Big Data, emphasizing how execution based on the insight gleaned from Big Data could be as crucial to a company’s profitability as the insight itself. As evidenced by this growing practice of collecting consumer data in order to cultivate personalized content, the media industry has not been remiss in its observation of the discernible success that data-driven companies boast relative to competitors that are less reliant on data. Finally, perhaps as satisfying as being able to browse through personalized, recommended content based on one’s past likes and preferences is the exclusion of repetitive content, as informed by one’s viewing history.

Media companies embrace their ascent into digital space in a plethora of ways. Some elect for a complete digital transformation, conducting a substantial part if not all of their business within browsers and applications rather than in print. There are also those that focus on enhancing the customer experience by maintaining contact with consumers through all touch points and following them from device to device, all the while gathering data to be used in optimizing the content provided. Another means through which media companies are realizing their full digital potential is the digitizing of their processes and operations. These businesses are initiating a shift towards digital products, a decision that is both cost-effective (cutting costs by up to 90% on information-intensive processes) and a boon to the efficacy of one’s data mining efforts. Warner Bros was one of the first in the industry to transform its storage and sharing of content into a single, totally integrated digital operation, beginning with the Media Asset Retrieval System (MARS). This innovative digital asset management system ushered in a transformation that effectively lowered Warner Bros’ distribution and management costs by 85%.

A Glimpse into the Future

So what’s next in this journey to digital conversion? According to the International News Media Association (INMA), all roads lead to the Internet of Things (IoT). Business Insider Intelligence asserts that by 2018, more than 18 billion devices will be connected to the Web. The progression into this new era of tech, where information can be harvested from the physical world itself, will not go unobserved by the media industry. Media companies are tasked with having to evolve beyond the screen.

Mitch Joel, President of Mirum Agency, writes:

“Transient media moments does not equal a strong and profound place to deliver an advertising message… the past century may have been about maximizing space and repetition to drive brand awareness, but the next half century could well be about advertising taking on a smaller position in the expanding marketing sphere as brands create loyalty not through impressions but by creating tools, applications, physical devices, true utility, and more robust loyalty extensions that makes them more valuable in a consumer’s life.”

Big Data anchors these efforts into the Digital Age, and the IoT will provide new, vital networks of information to fortify this crusade.

Contact our team to learn more about how True Interaction can develop game-changing platforms that cut waste and redundancy as well as boost margins for your media company.

By Justin Barbaro

Neural Networks: What They Are, and Their Many Applications

It’s behind the Tesla autopilot feature. It’s your recommendations from Netflix. It’s when Siri recognizes your speech and serves you results. It’s the foundation for your credit card’s fraud detection technology. We see the application of neural networks and machine learning all around us today in nearly every aspect of life.

With the exponentially increasing volumes and varieties of data, the advent of cheaper and faster computational processing, and ubiquitous affordable mass data storage, neural networks aren’t just for Google and Microsoft anymore. It’s important for small and medium business owners to know what neural networks are, what they can do for their business, and also what their limitations are.

So What is a Neural Network?

Let’s grab a definition from Dr. Robert Hecht-Nielsen, an early pioneer in neural networks in the 1980s and 1990s. He defines an artificial neural network (ANN) as:

“…a computing system made up of a number of simple, highly interconnected processing elements, which process information by their dynamic state response to external inputs.”

An ANN mimics certain features of the brain’s physical structure and information processing, with a web of neural connections consisting of myriad interconnected and layered simple processing elements. Akin to its biological sibling, an ANN works as follows:

1. Each processing element (essentially a neuron) receives inputs from other elements;
2. The inputs are weighted and added;
3. The result is then transformed (by a transfer function) into the output.

The transfer function may be a step function, a sigmoid function (S-curve), or a hyperbolic tangent function, among others.
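The three steps above can be sketched in a few lines of Python; this is a minimal illustration of one processing element, and the inputs, weights, and bias are arbitrary values chosen for the example:

```python
import math

# One processing element (neuron): weighted sum of inputs plus a bias,
# passed through a chosen transfer function.
def neuron(inputs, weights, bias, transfer="sigmoid"):
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    if transfer == "step":
        return 1.0 if total >= 0 else 0.0
    if transfer == "tanh":
        return math.tanh(total)
    return 1.0 / (1.0 + math.exp(-total))  # sigmoid (S-curve)

# Arbitrary illustrative values: two inputs, two weights, one bias.
out = neuron([0.5, 0.8], weights=[0.4, -0.2], bias=0.1)
```

A full network is just many of these elements wired together in layers, with each element's output feeding the inputs of the next layer.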

ANNs are basic learning devices, in the form of hardware or software. In both cases, the fundamental idea is to assemble many simple processors that interact through a dense web of interconnections, resulting in a network architecture that is unlike the sequential, linear processing and architecture of conventional computer systems.

How do Neural Networks Differ from Conventional Computers?

Conventional computers are good at numerical computation; they apply formulas, decision rules, and algorithms instructed by users to produce outputs from the inputs. A neural network, on the other hand, is not a general-purpose problem solver. It is good at complex numerical computation for the purposes of solving systems of linear or non-linear equations, organizing data into equivalent classes, and adapting the solution model to environmental changes. But it is not good at mundane tasks such as calculating payroll, balancing checkbooks, and generating invoices. Nor is it good at logical inference – a job suited for expert systems. Therefore, business leaders must know when a problem can be solved with an ANN; moreover, to make an ANN work, it must be tailored specifically to the problem it is intended to solve.

ANNs, like people, learn by example. They improve their own rules; the more decisions they make, the better the decisions may become. Data scientist and entrepreneur Jeremy Howard describes this phenomenon quite colorfully:

The difference here is each thing builds on each other thing. The data and the computational capability are increasing exponentially, and the more data you give these deep-learning networks and the more computational capability you give them, the better the result becomes because the results of previous machine-learning exercises can be fed back into the algorithms. That means each layer becomes a foundation for the next layer of machine learning, and the whole thing scales in a multiplicative way every year. There’s no reason to believe that has a limit.

An ANN is configured for a specific application, such as pattern recognition or data classification, through a learning process. Think of it simply as a branch of statistics, designed for a world of big data, where the most common application of machine learning is to make predictions.
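"Learning by example" can be seen in the simplest possible network, a single perceptron that nudges its own weights after every wrong decision. The toy task here (learning logical AND from four labeled examples) is my own choice for illustration, not one drawn from the text:

```python
# A perceptron "learns by example": each wrong prediction
# nudges the weights toward the correct answer.
def train_perceptron(samples, epochs=20, lr=0.1):
    w = [0.0, 0.0]  # one weight per input
    b = 0.0         # bias
    for _ in range(epochs):
        for (x1, x2), target in samples:
            pred = 1 if x1 * w[0] + x2 * w[1] + b >= 0 else 0
            err = target - pred       # zero when the guess was right
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

# Toy task: logical AND, as four labeled examples.
and_data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(and_data)

def predict(x1, x2):
    return 1 if x1 * w[0] + x2 * w[1] + b >= 0 else 0
```

Deep networks stack many such elements and use more sophisticated update rules, but the principle is the same: decisions are graded against examples, and the rules improve with each pass.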

Applying Neural Networks to Different Industries

Because a neural network must be built and tailored specifically to the problem it is intended to solve, you can’t just slap on a machine learning solution someone else did for their own context and set of data. The best way to determine if you can leverage neural networks in your own business and then reap the gains achieved by them is to learn and understand how neural networks intersect and function across a breadth of different industries; this will inform your own specific situation. I’ve shared several examples for you below.

Marketing

In marketing, we identify customers likely to respond positively to a product or service, and target any advertising or solicitation towards them. Target marketing involves market segmentation, where we divide the market into distinct groups of customers with different consumer behavior. Neural networks are well-equipped to carry this out by segmenting customers according to basic characteristics including demographics, socio-economic status, geographic location, purchase patterns, and attitude towards a product.

Unsupervised neural networks can be used to automatically group and segment customers based on the similarity of their characteristics, while supervised neural networks can be trained to learn the boundaries between customer segments based on a group of customers with known segment labels, for example: frequent buyer, occasional buyer, rare buyer. Machine learning can save your organization both time and money by ensuring that you avoid contacting customers who are unlikely to respond. One study showed that neural networks can be used to improve response rates from the typical one to two percent, up to 95%, simply by choosing which customers to send direct marketing mail advertisements to. Neural networks can also be used to monitor customer behavior patterns over time, and to learn to detect when a customer is about to switch to a competitor.
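Self-organizing maps are the classic unsupervised network for this kind of grouping; as a simpler stand-in that shows the same segmentation idea, here is a plain k-means sketch. The customer profiles (purchases per year, average spend) are hypothetical:

```python
import random

# Hypothetical customer profiles: (purchases per year, average spend in $).
customers = [(52, 35), (48, 40), (12, 90), (10, 110), (1, 500), (2, 450)]

def kmeans(points, k=3, iters=50, seed=0):
    random.seed(seed)
    centers = random.sample(points, k)  # initial center guesses
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:  # assign each point to its nearest center
            i = min(range(k),
                    key=lambda c: (p[0] - centers[c][0]) ** 2
                                + (p[1] - centers[c][1]) ** 2)
            clusters[i].append(p)
        centers = [  # move each center to its cluster's mean
            (sum(p[0] for p in cl) / len(cl), sum(p[1] for p in cl) / len(cl))
            if cl else centers[i]
            for i, cl in enumerate(clusters)
        ]
    return centers, clusters

centers, segments = kmeans(customers)
```

The resulting groups correspond to the frequent / occasional / rare buyer segments described above; a supervised model would instead be trained on customers whose segment labels are already known.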

Retail & Sales

Neural networks excel in the realm of sales forecasting, due to their ability to simultaneously consider multiple variables such as market demand for a product, customers’ disposable income, population size, product price, and the price of complementary products. Neural-network forecasting of sales in supermarkets and wholesale suppliers has been shown to outperform traditional statistical techniques like regression, as well as human experts.

Another important area where retail and sales can benefit from neural networks is in shopping cart analysis, such as gathering and inputting information relating to which products are often purchased together, or the expected time delay between sales of two products.

Retailers can use this information to make decisions about the layout of the store: if shopping cart analysis reveals a strong association between products A and B then they can entice consumers to buy product B by placing it near product A on the shelves. If there is a relationship between two products over time, say within 6 months of buying a printer the customer returns to buy a new cartridge, then retailers can use this information to contact the customer, decreasing the chance that the customer will purchase the product from a competitor.
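The printer-and-cartridge association above can be sketched with simple co-occurrence counts and "lift" (how much more often two products are bought together than chance would predict). The baskets below are made up for illustration:

```python
from collections import Counter
from itertools import combinations

# Hypothetical shopping baskets.
baskets = [
    {"printer", "cartridge", "paper"},
    {"printer", "cartridge"},
    {"paper", "pens"},
    {"printer", "paper"},
    {"cartridge", "paper"},
]

n = len(baskets)
item_count = Counter(item for b in baskets for item in b)
pair_count = Counter(frozenset(p) for b in baskets
                     for p in combinations(sorted(b), 2))

def lift(a, b):
    # lift > 1 suggests a and b are bought together more than chance
    support_ab = pair_count[frozenset((a, b))] / n
    return support_ab / ((item_count[a] / n) * (item_count[b] / n))
```

A pair with lift above 1 (here, printer and cartridge) is a candidate for adjacent shelf placement or a follow-up offer.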

Banking & Finance

One of the main areas of banking and finance that has been affected by neural networks is trading and financial forecasting. Neural networks have been applied successfully to problems like derivative securities pricing and hedging, futures price forecasting, exchange rate forecasting and stock performance and selection prediction since the 1990s.

But there are many other areas of banking and finance that have been improved through the use of neural networks. For many years, banks have used credit scoring techniques to determine which loan applicants they should lend money to. Traditionally, statistical techniques have driven the software. These days, however, neural networks are the underlying technique driving the decision making. Credit scoring systems can learn to correctly identify good or poor credit risks. Neural networks have also been successful in learning to predict corporate bankruptcy.
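As a sketch of the scoring idea, here is a toy credit scorer using a logistic model, which is essentially the single-neuron case of an ANN. The features and weights are assumptions standing in for parameters a real system would learn from labeled repayment history:

```python
import math

# Assumed feature weights; a real scorer would learn these from
# historical loans labeled good/poor risk.
WEIGHTS = {"income_k": 0.03, "debt_ratio": -4.0, "late_payments": -0.8}
BIAS = 0.5

def repayment_probability(applicant):
    # Weighted sum of features, squashed to a 0..1 probability.
    z = BIAS + sum(WEIGHTS[k] * applicant[k] for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical applicants.
good = {"income_k": 80, "debt_ratio": 0.2, "late_payments": 0}
risky = {"income_k": 30, "debt_ratio": 0.6, "late_payments": 4}
```

The lender would approve applicants whose probability clears a chosen threshold; the bankruptcy-prediction models mentioned above work the same way, with firm financials as the inputs.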

Insurance

The insurance industry can leverage neural networks in a similar manner to the marketing industry: policy holders can be segmented into groups based upon their behaviors, which can help to determine effective premium pricing. And like the banking and finance sectors, the insurance industry is constantly aware of the need to detect fraud; neural networks can be trained to learn to detect fraudulent claims or unusual circumstances. Competition is fierce in the insurance industry, and when a policy holder leaves, useful information can be gleaned from their history which might indicate why they left. Using machine learning to decide which customers to offer incentives to stay, such as reduced premiums or no-claims bonuses, can help to retain good customers.

Telecommunications

Machine learning offers telecommunications organizations a clear opportunity to ascertain a much more complete picture of their operations and their customers, as well as to further their innovation efforts. Some companies are using a series of neural networks to analyze customer and call data in order to predict if, when, and why a customer is likely to leave for another competitor. Many telecommunications organizations use machine learning to help predict the effects of forthcoming promotional strategies, as well as sift through and refine data to find the most profitable customers.

Other uses of neural networks in telecommunications include:

– Optimizing routing and quality of service by analyzing network traffic in real time

– Analyzing call data records in real time to identify fraudulent behavior immediately

– Allowing call center reps to flexibly and profitably modify subscriber calling plans immediately

– Tailoring marketing campaigns to individual customers using location-based and social networking technologies

– Using insights into customer behavior and usage to develop new products and services

Operations management

Neural networks have been used successfully in operations management, particularly in the areas of scheduling and planning. R&D into the scheduling of machinery, assembly lines, and cellular manufacturing using neural networks has grown increasingly prevalent over the past fifteen years. Other scheduling problems, like timetabling, project scheduling, and multiprocessor task scheduling, have also been addressed with neural networks.

The use of neural networks in various operations planning and control activities covers a broad spectrum of applications, from demand forecasting to shop floor scheduling and control. Neural networks have also been used in conjunction with simulation modeling to learn better manufacturing system designs.
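Demand forecasting is the most self-contained of these, so here is a hedged sketch: a network learns to predict next-period demand from the three previous periods. The demand series is synthetic (a seasonal pattern plus noise); a real deployment would use historical sales data and richer features.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(3)
t = np.arange(240)
# Synthetic monthly demand: seasonal cycle plus noise
demand = 100 + 20 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 2, 240)

# Lagged features: predict d(t) from [d(t-3), d(t-2), d(t-1)]
lags = 3
X = np.array([demand[i:i + lags] for i in range(len(demand) - lags)]) / 100.0
y = demand[lags:]

reg = MLPRegressor(hidden_layer_sizes=(32,), max_iter=5000, random_state=3)
reg.fit(X[:200], y[:200])          # train on the first 200 periods
pred = reg.predict(X[200:])        # forecast the held-out periods
mae = np.abs(pred - y[200:]).mean()
print(f"holdout mean absolute error: {mae:.2f}")
```

Holding out the final periods mimics how a forecast is actually used: the model never sees the future it is asked to predict.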

Operations management also benefits from neural networks in the area of quality control, as neural networks can be integrated with traditional statistical control techniques to enhance their performance. One example is a neural network used to monitor soda bottles to make sure each bottle is filled and capped properly.
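The bottling example can be sketched like so: a small network learns to classify each bottle's fill volume and capping torque as acceptable or defective. The sensor features, tolerance thresholds, and pass/fail rule are all hypothetical stand-ins for a plant's real specifications.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(4)
n = 600
# Hypothetical line-sensor readings per bottle
fill_ml = rng.normal(500, 5, n)        # fill volume in ml
cap_torque = rng.normal(12, 1.5, n)    # capping torque in N*m
X = np.column_stack([fill_ml / 500, cap_torque / 12])
# Toy spec: pass when fill is within +/-10 ml and torque within +/-3 N*m
y = ((np.abs(fill_ml - 500) < 10) & (np.abs(cap_torque - 12) < 3)).astype(int)

clf = MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=3000, random_state=4)
clf.fit(X, y)
print(f"training accuracy: {clf.score(X, y):.2f}")
```

In practice such a classifier would run alongside, not instead of, conventional control charts, catching defect patterns that single-variable limits miss.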

Neural networks can also be used in diagnostics, and have been used to detect faults in electrical equipment and satellite communication networks. Project management tasks have also been tackled by using neural networks to forecast project completion times for knowledge work projects, or to predict workloads and delivery times in software engineering and development projects.

Conclusion

It’s apparent that the application of neural network technology is having disruptive effects and is becoming more and more pervasive in common business operations – even across the SMB market – with every passing day. My advice for business leaders is to take the time to thoroughly learn and understand what the technology implies, so you may begin to identify use cases and scenarios within your own business ecosphere. Research and identify the possible opportunities or insights that may be gleaned by plugging into Big Data and/or employing data vendor systems. And by all means, seek the consultation of an expert – one who seeks to highlight and understand the key critical functions of your business, identifies which parts and/or interaction points you need to improve, and then helps you articulate a realistic, cohesive plan to develop a scalable solution.

Sources

Chartier, Tim. Big Data: How Data Analytics Is Transforming the World.
Lee, Eldon. Artificial neural networks and their business applications.
Smith, Kate and Jatinder, Gupta. Neural networks in business: techniques and applications for the operations researcher.

By Michael Davison