Transform & Innovate Legacy Technology

Resources ($) spent treading water

Transforming legacy technology remains a difficult proposition. A lack of historical expertise among current stakeholders is one impediment, as is a lack of transparency around soft costs within the technology budget. The full expense of legacy infrastructure can remain hidden until an organization is confronted with a significant technology outage or a sudden increase in maintenance costs. Maintaining legacy systems and applications consumes approximately 30% of the average organization’s IT budget, which includes:

  • Maintenance
  • Talent (Human capital)
  • Internal and external compliance (e.g. GDPR)
  • Risk: security threats and trends
  • Agility, scalability and stability

One of the most important factors in dealing with transforming legacy technology is facing the reality of your organization’s culture and alignment.

Where to start…

Begin by drawing a line around your monolithic systems, code, and infrastructure. Companies often believe they must reengineer all their legacy systems from the ground up. This is a common error. Instead, it is critical to delineate which portions of the system can be isolated and identified as “core,” and then build APIs or other structures around that core.

Develop an integration strategy and then construct an integration layer. This means some code will be written at the foundational or infrastructure level, then at the database layer, and finally in the user experience environment. It is critical to identify those systems which can be detethered and then “frozen.” This facilitates a phased integration approach, upon which additional functionality can be layered. Depending on the complexity of the legacy architecture, wholesale changes may be cost prohibitive, so the ability to isolate, freeze, and use a layered-build approach is an appropriate solution. This permits an organization to stabilize its application code and then build APIs or other integration layers around the “frozen” areas of the technology stack. In some circumstances, blockchain can be useful in providing a fast and simple way to put an integration layer in place around legacy or “frozen” environments.
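The “freeze and wrap” idea can be sketched in a few lines. Below is a minimal illustration in Python, assuming a hypothetical legacy routine and record format; the point is that modern callers only ever see the integration layer, never the frozen core.

```python
# A minimal sketch of the "freeze and wrap" approach: the legacy routine is
# left untouched, and a thin integration layer exposes a clean, stable API
# around it. All names and formats here are hypothetical.

def legacy_lookup(raw_record: str) -> str:
    """Frozen legacy code: returns a pipe-delimited record, quirks and all."""
    return f"CUST|{raw_record.upper()}|ACTIVE"

class CustomerAPI:
    """Integration layer: translates between modern callers and the frozen core."""

    def get_customer(self, customer_id: str) -> dict:
        raw = legacy_lookup(customer_id)   # call into the frozen core
        _, cid, status = raw.split("|")    # adapt its format once, in one place
        return {"id": cid, "status": status.lower()}

api = CustomerAPI()
print(api.get_customer("c-1001"))
```

Because the legacy format is adapted in exactly one place, the frozen core can later be replaced without touching any of its consumers.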

Missing Link

The most important component of transformation and innovation is the people within the organization, not the technology or the skillsets around the technology. Industry studies indicate a potential 20-30% increase in productivity and creative thought when individuals are engaged and aligned with the organization’s goals, and when the change processes align with individual goals and performance. All departments and stakeholders must be in alignment, from product, QA, development, and infrastructure to the end users. This is the most important aspect of any technology transformation initiative: creating a safe and collaborative environment that facilitates “creative dissent.”

Understanding Shoppers Using Big Data

The emergence of big data has taken many industries by storm. The consumer packaged goods (CPG) industry has been at the forefront of big data technology. The success of any CPG company is dependent on the ability to take advantage of changing consumer trends better than the competition. Below are ways in which CPG companies are using big data and analytics in order to drive growth and understand their customers.

Grocery Store Decisions
While shoppers walk through grocery aisles looking for the products that best meet their needs, many people are unaware of the logistical and strategic methods behind shelving products in the grocery store. The ability to analyze and leverage massive datasets is important for CPG companies competing for shelf space and customer awareness. Retailers and distributors gather consumer insights and measure actual point-of-sale (POS) data for strategic decision making related to production, distribution, and promotion.

Marketing Strategies
Staying up to date with point-of-sale data allows CPG companies to reconcile inventory after each transaction and adjust inventory numbers based on high or low sales. Additionally, actual point-of-sale data gives a company the opportunity to build a general customer profile. Valuable information such as names, addresses, phone numbers, past purchases, and order history is recorded to generate personalized product recommendations based on a customer profile.
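As a sketch of that idea, the co-occurrence logic behind a simple product recommendation can be expressed in a few lines. The order history and item names below are hypothetical; production recommenders use far richer models.

```python
from collections import Counter

# Hypothetical order history: each inner list is one customer transaction.
orders = [
    ["cereal", "milk"],
    ["cereal", "milk", "bananas"],
    ["milk", "bread"],
]

def recommend(profile_items, orders, top_n=1):
    """Recommend items that most often co-occur with what the customer already buys."""
    counts = Counter()
    for basket in orders:
        if any(item in basket for item in profile_items):
            counts.update(i for i in basket if i not in profile_items)
    return [item for item, _ in counts.most_common(top_n)]

print(recommend(["cereal"], orders))
```

A customer profiled as a cereal buyer gets recommended the item most often purchased alongside cereal.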

Social Media
The growth of social media and digital technology has given consumers a larger platform to express their opinions on brands. This has helped CPG brands to continually evolve and change based on direct feedback from their customers. The gathering of information from social media platforms, known as social intelligence, can help CPG companies understand their different customer segments. Sentiment analysis tools can extract data from Twitter, Facebook, Pinterest, and Instagram to develop an in-depth understanding of what customers care about.
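A bare-bones version of the scoring idea behind sentiment analysis looks like this. The word lists are illustrative assumptions; real tools rely on trained models rather than fixed lexicons.

```python
# A minimal lexicon-based sentiment sketch: count positive and negative
# words and compare. The lexicons below are toy examples.
POSITIVE = {"love", "great", "amazing"}
NEGATIVE = {"hate", "awful", "broken"}

def sentiment(post: str) -> str:
    words = post.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

posts = ["Love this shampoo", "The pump arrived broken"]
print([sentiment(p) for p in posts])
```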

Practical Value
In 2012, L’Oréal used social media analytics to understand its brand perception in the digital world. By mining and filtering social media data, L’Oréal analysts were able to track the customer “Voice of Beauty” program. This data was highly useful, as L’Oréal USA was able to use consumer insights to engage with its customers. The social media analytics used by L’Oréal enabled the company to develop a stronger bond with its customers and uncover customer sentiment in real time to improve the product development process.

With Synaptik’s social listening tools, companies can track conversations around specific phrases, words, or brands. Sign up for a 30-minute consultation and we can show you what customers are saying about your products and services across multiple social media channels (Facebook, Twitter, LinkedIn, Pinterest).

Cloud Computing enters the Madness!

The NCAA Division 1 Men’s Basketball Tournament is unlike any other event in sports. Widely known as March Madness, the 68 team single-elimination tournament has become one of the most famous annual sporting events in the United States. The unpredictability and excitement of each game draws millions of viewers each year. Experts and fans are amazed by the improbable victories of smaller schools over traditional powerhouses (2018 marked the first year a number 16 seed defeated a number 1 seed). However, advances in modern computing have uncovered new patterns and insights to make informed predictions on games.

In Comes Google!
At the end of 2017, the NCAA announced Google Cloud as its official cloud provider. Additionally, the NCAA agreed to migrate 80 years’ worth of competition data to the Google Cloud. This partnership recognized Google Cloud as the official cloud sponsor of the NCAA Tournament and provides analytics and machine learning tools for interested fans and bracket makers. These platforms allow fans to search, compare, and analyze team and player statistics.

Practical Value
Traditional basketball statistics (three-point percentage, free-throw percentage, point differential, etc.) are normally used to determine outcomes and the strengths of individual teams. However, these statistics fail to incorporate historical trends that could be crucial to winning your bracket pool. Google Cloud’s data integration uncovered new insights that go beyond the box score, such as:
• Teams wearing blue have the most Final Four appearances
• PAC-12 teams are best at winning tight games.
• Teams with cat mascots have caused the most upsets.

Machine Learning
The first round of the NCAA tournament is very unpredictable: only 164 of the 18.8 million brackets on ESPN.com were perfect after round one in 2017. A competition hosted by the NCAA and Google Cloud challenged participants to build and train machine learning models to forecast the outcome of games. While still new to the market, machine learning models have predicted these games with relative success. Data scientists at MarketWatch used statistics from all the first-round teams from 2001 to 2017 to develop a machine learning algorithm. When they tested the algorithm against the 2017 first-round data, it had a 75% success rate.
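The general approach can be sketched with a tiny logistic regression trained by gradient descent. The feature values below (seed difference and a statistical gap between the teams) are made up for illustration and are not the MarketWatch data.

```python
import math

# Toy training data: (seed_difference, stat_gap) -> 1 if the favorite won.
# Values are illustrative only, not real NCAA results.
X = [(15, 12.0), (13, 9.5), (1, 0.5), (3, -1.0), (11, 7.0), (2, -2.0)]
y = [1, 1, 0, 0, 1, 0]

def predict(w, b, x):
    """Logistic model: probability that the favorite wins."""
    z = b + sum(wi * xi for wi, xi in zip(w, x))
    return 1 / (1 + math.exp(-z))

# Train with plain stochastic gradient descent on the logistic loss.
w, b = [0.0, 0.0], 0.0
for _ in range(2000):
    for xi, yi in zip(X, y):
        err = predict(w, b, xi) - yi
        b -= 0.05 * err
        w = [wj - 0.05 * err * xj for wj, xj in zip(w, xi)]

# A big seed gap plus a big statistical edge should predict a win.
print(round(predict(w, b, (14, 10.0))))
```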

Summary
While machine learning and data analytics are not immune to error, the ability to analyze and identify patterns in large datasets provides a unique opportunity for businesses and individuals to uncover new and valuable insights.

Synaptik, True Interaction’s Data Management and Machine Learning platform, makes it easy to connect to or build classical statistical models, custom machine learning models, algorithms, and even neural nets to boost your workflow with automated analytics. For more information or a demo, please visit us at https://synaptik.co/ or email us at hello@www.true.design

Data Integrity: How to Keep It

Data is central to your business. So how can you ensure that your small or medium sized business is maintaining its data in an accurate and consistent manner? Doing so is called data integrity. Data integrity, a central part of information security, involves storing and managing data in a precise and dependable way. Maintaining data integrity is common practice in large and enterprise organizations. But what can your small or medium sized business do to ensure data integrity despite limited access to time and resources?

1.) Establish data entry protocols
Maintaining data integrity begins at data entry. Creating data entry protocols aligned with business processes can serve as the first line of protection from unclean, inaccurate data. But protocols are just the beginning. Ensuring the onboarding and continuous development of staff will need to be a constant priority. In addition, protocols will require updates when systems, or the demands on those systems, change.

2.) Set rules for data validation
Errors are bound to occur and compound despite the most robust data entry protocols. However, even the most minor errors can threaten data integrity. In response, database administrators should use validation rules to control the data entered into company databases. For example, administrators can restrict the permissions of certain individuals to alter entered data. This provides an added layer of quality assurance and security when maintaining data integrity.
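A minimal sketch of such validation rules, with hypothetical field names:

```python
# Record-level validation rules an administrator might enforce before a row
# is accepted into the database. Fields and rules here are illustrative.
RULES = {
    "email": lambda v: "@" in v,
    "quantity": lambda v: isinstance(v, int) and v >= 0,
    "region": lambda v: v in {"NA", "EMEA", "APAC"},
}

def validate(record: dict) -> list:
    """Return the list of fields that fail their rule (empty means valid)."""
    return [field for field, rule in RULES.items()
            if field in record and not rule(record[field])]

print(validate({"email": "a@b.com", "quantity": 3, "region": "NA"}))
print(validate({"email": "not-an-email", "quantity": -1, "region": "NA"}))
```

Rejecting or flagging rows at this layer keeps bad data out of the database rather than cleaning it up after the fact.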

3.) Automate data cleaning
Maintaining clean data is essential to ensure data integrity. As mentioned, processes to keep data clean may involve establishing rules for data input and scheduling data health checkups. However, these measures cannot prevent all unclean data. One option is to select and set up a tool that automates data cleaning processes. Doing so reduces the time and energy team members spend ensuring data integrity.
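A minimal automated-cleaning pass might normalize formatting and drop exact duplicates, as sketched below. Real pipelines add type coercion and scheduled health checks on top of steps like these; the sample rows are hypothetical.

```python
# Trim whitespace, normalize case, and drop exact duplicates.
def clean(rows):
    seen, out = set(), []
    for row in rows:
        normalized = tuple(str(v).strip().lower() for v in row)
        if normalized not in seen:      # keep only the first occurrence
            seen.add(normalized)
            out.append(normalized)
    return out

raw = [(" Alice ", "NY"), ("alice", "ny"), ("Bob", "TX")]
print(clean(raw))
```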

In short, the data integrity of your small or medium-sized business would substantially increase following a systematic approach to data entry, validation, and cleaning. Fortunately, there are a number of platforms available that can help automate some or all of these processes for your business for a reasonable fee.

True Interaction built SYNAPTIK, our Business Process Automation Platform, specifically to help manage core business processes like data cleaning and enhance the data integrity of digital systems. For more information or a demo, please visit us at https://synaptik.co/ or email us at hello@www.true.design

How to Make Best-of-Breed Software Even Better

Selecting an optimal best-of-breed system represents one of the most critical decisions for any company looking to purchase new software. A best-of-breed system is one that best meets the needs of a specific business function. Examples include Salesforce (customer relationship management), Quickbooks (accounting), and ADP (human resources).

Choosing a best-of-breed software solution can have a number of advantages:

1. Getting the best product for the job

Best-of-breed systems will address business needs in the most complete way. For instance, Salesforce provides solutions for most of the business challenges faced by salespeople. When it falls short, applications that sync with Salesforce can further tailor to any needed experience.

2. Working with specialized vendors
Each vendor will likely understand your business pain points better. Quickbooks, for instance, provides financial management solutions to millions of small businesses. Their support teams are likely well equipped to answer questions about specific business challenges because they have seen them before.

3. Receiving continuous updates
Most best-of-breed systems need to continuously grow to meet customer demands. That’s how they maintain their best-of-breed status! Frequent updates to improve the software will likely lead to better experiences for customers.

However, best-of-breed systems also have a number of disadvantages:

1. Increasing complexity

Using Salesforce, Quickbooks and ADP for your business? Get ready to manage the complexity of multiple systems and multiple vendors. Doing so can require a tremendous amount of time and energy from your team.

2. Training
Managing complexity is one thing; learning how to get the most out of all these systems is another. Many best-of-breed systems require users to upskill and keep up their understanding of individual systems. Training becomes more complex if you work in an industry with high turnover.

3. Data Integrity
This drawback is critical. With multiple systems, you must manage multiple sources of data. How is this data stored? Can it be organized in a way that produces actionable insights for a given team? Answering these questions is critical before selecting new best-of-breed software solutions.

One potential method of improving the data integrity from multiple best-of-breed systems is to integrate all the data in one safe, well-organized location such as a data management platform.
True Interaction built SYNAPTIK, our Business Process Automation Platform, specifically to help manage core business processes like data cleaning and enhance the data integrity of digital systems. For more information or a demo, please visit us at https://synaptik.co/ or email us at hello@www.true.design

Buy the Numbers? Private Equity and Hedge Funds

Private equity firms and hedge funds should be thriving in the age of information. With thousands of alternative data options, these companies can select which ones are most valuable for their business. However, many private equity firms and hedge funds do not rely on quantitative methods for assessing investment viability. According to the Financial Times, only 62% of hedge funds are investing in machine learning, and only 54% are investing in big data initiatives. For the nearly 40% of hedge funds seeking potential use cases for either, below are some examples:

Tracking industries
Tracking specific industries represents a major task for people within hedge funds and private equity firms. Hedge funds may have one person or a desk in charge of monitoring and forecasting specific items within an industry. Similarly, private equity funds may specialize in purchasing companies in specific industries such as retail or food service.

Tracking companies pre-purchase
The critical work of private equity funds is researching, bidding on, and purchasing companies for their portfolio. To a lesser extent, hedge fund analysts want to monitor the success of certain companies prior to investment or divestment.

Tracking sales
Tracking company sales can signal the health of specific companies. Hedge fund analysts will want to know the sales numbers and projections of the products under their watch. These numbers will likely include cyclical time series data. Perhaps more importantly, private equity firms who have already purchased companies need to monitor how their adjustments (if any) have affected total revenue.

In summary, digitization use cases abound for hedge fund and private equity companies. These use cases include:
1.) Data ingest for millions of data points on companies and industries
2.) Analytics dashboards with visualization capabilities
3.) Data science analysis of industry and/or company data
4.) Low-code apps with user-friendly graphic user interfaces (GUIs) and backends storing proprietary algorithms
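The first use case above can be sketched as a minimal ingest step: parse a raw vendor feed, coerce types, and emit normalized records ready for storage. The feed format and field names below are hypothetical.

```python
import csv
import io

# A hypothetical raw vendor feed, as it might arrive over an API or FTP drop.
raw_feed = """ticker,metric,value
ACME,revenue,120.5
ACME,revenue,121.0
"""

def ingest(feed: str):
    """Parse the feed and coerce the value column to a float."""
    reader = csv.DictReader(io.StringIO(feed))
    return [{"ticker": r["ticker"], "metric": r["metric"], "value": float(r["value"])}
            for r in reader]

records = ingest(raw_feed)
print(len(records), records[0]["value"])
```

At scale, the same parse-and-coerce shape is what sits in front of the analytics dashboards and data science work in the other use cases.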

Synaptik, our data management and machine learning platform, is currently providing data ingest and management services to a billion-dollar hedge fund. Learn more about our process by contacting us at www.synaptik.co

Analytics for Batteries? How Machine Learning Can Transform the Modern Energy Grid

Individuals who follow the energy industry are aware of the decreasing market share of peakers. All plays on words aside, peakers are natural gas-fired power plants that provide energy grids with extra surges of electricity during peak usage hours. Today, lithium ion batteries are emerging as a growing source of competition in this 1.1-billion-dollar industry. These batteries can ease the usage of natural-gas power plants by storing energy during the day (i.e. from solar panels) for usage at another time.

However, lithium ion battery facilities are still up to 35% more expensive to manage and operate than natural gas peakers. Added costs involve battery construction, maintenance, and disposal. Technological advances will likely make a dent in these added costs over the next decade. Yet there are ways for lithium ion battery providers to begin reducing costs today in order to make them more price-competitive with peaker plants. These use cases involve:

1.) Predicting spikes and modeling unpredictability
Today, the power grid relies on lithium ion batteries primarily to provide short bursts of electricity. These short bursts stabilize the voltage and frequency of the grid and last only several seconds at a time. In other words, batteries focus on short-term duties in the current electrical grid. Lithium ion battery companies can better control their usage and design costs if they can better predict spikes and uncertainty, delivering the power the grid needs at specific times.
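The simplest version of spike detection flags any reading far above the trailing average, as sketched below. Forecasting models would replace this threshold rule in practice, and the load readings are illustrative only.

```python
# Flag readings that exceed the trailing average by a given factor.
def detect_spikes(readings, window=3, factor=1.5):
    spikes = []
    for i in range(window, len(readings)):
        trailing = sum(readings[i - window:i]) / window
        if readings[i] > factor * trailing:
            spikes.append(i)   # index of the spiking reading
    return spikes

load = [100, 102, 101, 99, 180, 103, 100]   # hypothetical grid-load readings
print(detect_spikes(load))
```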

2.) Predicting lengths of peak demand
Lithium ion batteries will need to provide more than short bursts of electricity in the future in order to encroach further on peaker market share. In time, analysts will need to understand how lithium ion batteries can provide more sustained levels of energy over several hours. This will allow lithium ion batteries to serve as more of a replacement for peakers, and less of a stopgap measure.

3.) Modeling battery lifecycle
As stated earlier, lithium ion batteries are up to 35% more expensive than natural-gas peakers for use by energy grids. Batteries may become even more expensive if the federal government repeals its 30% investment tax credit for these facilities. If battery facilities could monitor real-time battery expenditure to the second and store these millions of data points for optimization analysis, the result could be batteries engineered to better meet the needs of the energy grid at a reduced cost.

Synaptik, True Interaction’s data management and machine learning platform, is currently exploring ways to better serve the energy industry. To learn more, please schedule a conversation at www.synaptik.co

How Law Firms Can Leverage Federated Searches

Lawyers and law firms must rely on the clearest information possible in order to best serve their clients. Federated searches represent one way that firms can become more adept at meeting this goal. These searches allow users to analyze thousands of data points about individuals and companies by pulling information from select database APIs (e.g. the Department of Justice, the Securities and Exchange Commission (SEC)) within seconds. However, not all lawyers and law firms require such powerful tools. Instead, they should consider employing a federated search product if:

1.) If A Quick Google Search Is Not Comprehensive Enough
Google searches are a powerful tool. In many cases, a Google search is enough to provide lawyers with information about individuals who are not central to a case. Google searches are likely used to investigate all individuals of interest, but they may not be enough to understand the situational context of the case.

2.) If a Private Investigator (P.I.) Is Not Needed
Private investigators generate a significant amount of business from lawyers and law firms. They represent the ultimate background check. Private investigators scour through personal records and documents for the person of interest. In addition, they will often trail persons of interest for several days and observe their behavior. For most cases, a P.I. will only be asked to research one or two persons of interest. This is because hiring a P.I. is expensive for firms, and only necessary in specific situations.

3.) If A Firm or Lawyer Can Afford It
Federated search capabilities come with a price tag. A federated search can cost several thousand dollars a month for creation and maintenance, depending on the number of agents. Large and medium-sized firms (> 35 lawyers) have enough staff and generate enough revenue to consider whether this type of service fits their needs. However, smaller firms and individual shops would probably not have enough revenue, or need, to justify the cost of federated search capabilities.

So what is the optimal use case for lawyers and law firms utilizing federated search capabilities? In short, when medium-to-large law firms need information on individuals with greater specificity than a Google search provides but without the comprehensiveness (and cost) of a private investigator.
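Mechanically, a federated search fans the same query out to several source adapters in parallel and merges the results, as sketched below. The adapters here are stubs; a real product would call the DOJ, SEC, and similar database APIs.

```python
from concurrent.futures import ThreadPoolExecutor

# Stub source adapters standing in for real database API clients.
def search_sec(name):
    return [f"SEC filing mentioning {name}"]

def search_doj(name):
    return [f"DOJ press release mentioning {name}"]

SOURCES = [search_sec, search_doj]

def federated_search(name):
    """Query every source concurrently and merge the hits in source order."""
    with ThreadPoolExecutor(max_workers=len(SOURCES)) as pool:
        results = pool.map(lambda fn: fn(name), SOURCES)
    return [hit for source_hits in results for hit in source_hits]

print(federated_search("Jane Doe"))
```

Because each source is queried on its own thread, total latency tracks the slowest source rather than the sum of all of them.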

Synaptik, True Interaction’s Data Management and Machine Learning Platform, helped one organization save $34M a year using its federated search capabilities. To learn more, please contact us at www.synaptik.co

The Two Ways Big Data Helps Developing Nations

Big data tools have become a national conversation across U.S. enterprises as businesses try to turn their existing data into a competitive advantage. Companies in industrialized nations have used big data tools for complex processes like consumer personalization, which creates a unique experience for each individual. While big data tools create advantageous situations for businesses, they can also provide cost-effective solutions for governance in developing nations. Although big data will not create resource parity with wealthier countries, developing nations can harness its power and potential to alleviate challenges in health care and tourism.

1. Improvement in health care
Healthcare is a difficult commodity to access due to cost and geography, even in industrialized countries like the United States. Traditional healthcare data includes vital statistics and hospital administration statistics. But with advances in technology, healthcare providers can draw on medical records, mobile phone and purchase records, GPS, social media, and more. The increase in mobile phone usage among developing nations presents an opportunity to improve the delivery of healthcare. India’s personal identification programme is an example of big data tools in healthcare. In 2010, the government of India began issuing cards and identification numbers to all of its citizens. The cards, identification numbers, and biometric information created opportunities to monitor health and social data, including electronic medical records and information on health insurance for low-income families. While this was an ambitious project for a developing country like India, it provided a foundation for collecting health statistics.

2. Improvement in tourism
In our previous post, we highlighted how tourism is changing in the information age. For developing nations, tourism can help generate revenue that can be transferred via taxes into essential services. Any improvements in the tourism industry could be highly beneficial for the overall economy of a developing nation. In Mexico, BBVA Bancomer, BBVA Data & Analytics, and SECTUR worked together in order to analyze digital footprint data of visitors in Mexico. Some of the findings of this study are listed below:
– National tourists used their credit cards for trips while the international tourists used their cards for entertainment
– The highest percentage of spending is generated by U.S. tourists, followed by visitors from Argentina.
– In Cancun and Playa del Carmen, tourist spending was concentrated on Fridays and Saturdays and was more stable during the week
Statistics provided by https://www.bbva.com/en/bbva-shows-big-data-can-boost-tourism-mexico/

These statistics are only a small portion of the general findings from BBVA Data & Analytics. Businesses in Mexico could use this data to design promotions for busier periods of the week (Fridays and Saturdays). Additionally, they are able to anticipate when international tourists are more likely to visit islands or other tourist destinations in Mexico. These types of metrics are highly useful, as they can provide new insights that were previously unavailable to businesses in developing countries.

At the current pace of technological advancement, big data tools will continue to increase in importance. Improvements in healthcare and tourism are two of the many ways in which big data technology can benefit developing nations. Big data tools are still relatively new even to the industrialized world, so leveraging this technology will be necessary for developing nations to compete in a global marketplace.

Sources:

Tena, M. (2016, December 02). BBVA shows how Big Data can boost tourism in Mexico | BBVA. Retrieved January 29, 2018, from https://www.bbva.com/en/bbva-shows-big-data-can-boost-tourism-mexico/

Wyber, R., Vaillancourt, S., Perry, W., Mannava, P., Celi, L. A., & Folaranmi, T. (2015, January 30). Big data in global health: improving health in low- and middle-income countries. Retrieved January 30, 2018, from http://www.who.int/bulletin/volumes/93/3/14-139022/en/

United States Government . (n.d.). The official U.S. government site for Medicare. Retrieved January 30, 2018, from https://www.medicare.gov/

Tourism in the Information Age

Big data and analytics have become increasingly relevant in recent years. Simply put, big data is a term that describes the large volume of data, both structured and unstructured, that a business collects on a daily basis. How people and organizations use this large volume of data is becoming more and more important in the modern marketplace. For the tourism industry, big data has huge potential, as it can track information on human activity and preferences that will benefit companies and their customers.

Industry Outlook

The tourism industry is a major contributor to the U.S. economy. In 2016, the U.S. travel and tourism industry generated over $1.5 trillion in economic output and supported over 7.6 million U.S. jobs. Additionally, according to the U.S. Department of Commerce, international travel to the United States should grow by 3% annually through 2021. Tourism benefits the U.S. economy on both national and local levels, as restaurants, hotels, ride services, and many other businesses generate significant revenue from it.

Big data and technological trends in tourism

1. Personalization
While many people associate a personalized consumer experience with tech giants like Netflix and Amazon, companies in the tourism industry should not be deterred from creating a more personalized experience. Making travel arrangements (flights, lodging, activities, etc.) can be a time-consuming process. For many working men and women, time is a valuable resource and a personalized experience would simplify a customer’s ability to find the right travel package.

2. Mobile Friendly Platforms
Across the world, smartphone usage is rising and future metrics indicate that this trend will only continue. Companies are developing their marketing strategies and platforms in order to reach consumers on their mobile devices. Mobile big data can be utilized by companies in the tourism industry to create personalized marketing campaigns. An increase in mobile internet traffic will force companies to develop a user-friendly platform for customers interested in traveling.

According to Criteo, airline and mobile bookings made up 27% of all online travel bookings worldwide. Additionally, smartphone bookings have seen a significant increase, with 33% more bookings year over year during the second quarter of 2016. This represents a shift in consumer behavior, as travelers are rarely separated from their smartphones.

3. Affordable New Destinations
Before the emergence of the internet and big data technology, travelers mainly explored historic and noteworthy sites like New York City, Los Angeles, Walt Disney World, etc. However, with big data analysis and other technological tools, travel companies have the ability to understand the travel habits and travel patterns of modern consumers. Knowing this type of information can be very beneficial as companies can respond with new offerings and packages that cater to the needs of individual consumers.

Big data and technological tools have been adopted in many industries as consumer data increases in importance. In a highly competitive industry like tourism, leveraging the right customer analytics to gain a competitive advantage will be crucial for the success of any firm.

Keep the Change! What ICOs Can Do for Your Business (Part 2)

Initial Coin Offerings (ICOs), business fundraising events similar to an initial public offering in which investors purchase company-issued tokens with digital currencies (e.g. Bitcoin) for equity, are increasing in frequency as cryptocurrencies become more widely accepted across the globe. (Please see Part 1 of this blog post for a more detailed explanation of the most common forms of ICO tokens.) This new system of fundraising holds as much promise as peril for companies seeking to use different ICO methods to attract investors. Below are the prevailing issues concerning each:

Security tokens: Security tokens may seem like an ideal way for a company to sell stock to outside investors, but they come with one major catch. As described in Part 1, security tokens allow companies to sell stock directly to investors. The challenge is that many companies seeking to launch security tokens as investment vehicles do so in an attempt to skirt the regulations that traditionally oversee these types of transactions. Ideally, companies would first seek regulatory approval so that their security token offerings are deemed legal, yet this process may appear too time-consuming and costly for many startups looking to security tokens to boost cash reserves. Companies seeking to use security tokens should make sure they can demonstrate compliance with the Howey Test, the result of a 1946 Supreme Court case that established the four characteristics of an investment contract transaction:
1.) An investment of money
2.) Expectation of profits from the investment
3.) The investment of money is in a common enterprise
4.) Any profit comes from the efforts of a promoter or party

So how can companies sell stock without investors expecting profits (point 2 above) in the form of dividends? That is a great question. Until it is answered, companies seeking to use security tokens need to be particularly careful of regulatory agencies such as the Securities and Exchange Commission, which has already warned companies that ICOs attempting to circumvent securities laws may be prosecuted under federal law.

Equity Tokens: Equity tokens, a category of security tokens, are still subject to the regulations detailed above. However, equity tokens could become a robust method for companies to raise early-stage funding by democratizing the opportunity to invest, should a company receive adequate legal approval to use one. One of the current limitations of equity tokens is timing. In order to succeed, equity tokens will need to rely on blockchain-enabled smart contracts between the equity issuer (the company) and the equity purchaser (the investor). Smart contracts are legal agreements between two parties that are stored, for transparency and posterity, on a decentralized blockchain and secured cryptographically. The challenge for democratization involves context: if only the most sophisticated contemporary investors understand complex transactions involving the blockchain and smart contracts, will these instruments actually empower “everyman” investors to participate? Only time will tell, but the not-too-far-off future looks bright for companies looking to leverage equity tokens legally to raise early-stage capital, provided they can comply with federal laws.
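To make the issuer/purchaser relationship concrete, here is a toy in-memory ledger mimicking what an equity-token smart contract records on-chain: who issued the shares, who holds them, and transfers between holders. Real equity tokens run as on-chain contracts; the class and names below are purely illustrative assumptions.

```python
# Illustrative sketch, not a real smart contract: an in-memory share ledger.
class EquityTokenLedger:
    def __init__(self, issuer: str, total_shares: int):
        self.issuer = issuer
        # At launch the issuer holds the entire float.
        self.balances = {issuer: total_shares}

    def transfer(self, sender: str, recipient: str, amount: int) -> None:
        """Move shares between holders; every call is a 'transaction'."""
        if self.balances.get(sender, 0) < amount:
            raise ValueError("insufficient balance")
        self.balances[sender] -= amount
        self.balances[recipient] = self.balances.get(recipient, 0) + amount

ledger = EquityTokenLedger("AcmeCo", 1_000_000)
ledger.transfer("AcmeCo", "investor_1", 50_000)   # investor buys a 5% stake
print(ledger.balances["investor_1"])              # 50000
```

On an actual blockchain, every such transfer would be a signed, timestamped transaction, which is what gives equity tokens their transparency of ownership and auditable voting records.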

Utility Tokens: With utility tokens, the possibilities are endless. As described in Part 1, a consortium of major banks is planning to use utility tokens to improve interbank transactions. Other utility token examples include:
Brave, a web browser company, has a Basic Attention Token (BAT) that provides advertisers with utility tokens based on user attention.
Kik, a messaging service, has Kin, a token awarded to developers based on the transactions their services generate on the platform.
Filecoin, a decentralized storage network, allows network members to earn Filecoin for hosting files on their unused hard drive space, which can then be exchanged for USD, Bitcoin and other cryptocurrencies.

One major future challenge with utility tokens is whether they can maintain a competitive advantage for their issuers. For instance, if Amazon monitored Filecoin’s success and decided to replicate its model while providing higher value for network members at scale, would Filecoin be able to leverage member engagement with its token to prevent token holders from leaving the network? Industry behemoths have become notorious for replicating competitors’ products (e.g. Facebook with Snapchat), and it is highly likely they will follow suit with successful companies that rely on utility tokens to retain network members.

By Justin Barbaro

Keep the Change! What ICOs Can Do for Your Business (Part 1)

EDITOR’S NOTE: This article is about how companies are using Initial Coin Offerings (ICOs) in transformative ways to raise funds and gain a competitive edge. True Interaction built SYNAPTIK, our Data Management, Analytics, and Data Science Simulation DMP, specifically to make it easy for leaders to collect and manage data to get to insights faster. For more information or a demo, please visit us at https://synaptik.co/ or email us at hello@www.true.design.

Bitcoin, Bitcoin, Bitcoin. The daily price of Bitcoin, the leading exchange cryptocurrency, is a topic of national conversation in the media. One of the more significant aspects of Bitcoin from a social perspective is that it likely represents how most people around the world first learn about cryptocurrencies.

Today, individuals and companies leverage digital assets such as cryptocurrencies and tokens in complex and comprehensive ways. One such method uses digital assets in the form of tokens to represent value within a network. Such tokens can represent stock holdings and other features, and can be used to generate investment throughout the lifecycle of a company.

One innovative new use of cryptocurrencies is the initial coin offering (ICO). Put simply, an ICO is a business fundraising event similar to an initial public offering in which investors purchase company-issued tokens with digital currencies (e.g. Bitcoin) for an equity stake in the organization or its member network. ICOs represent an intriguing and innovative new route for companies to secure the capital they need to grow and thrive – but not without complications or drawbacks.

This blog post contains two parts. Part I below outlines the three most common forms of crypto tokens used for ICOs (security tokens, equity tokens and utility tokens) and how they are currently used in practice as part of an offering. Part II, coming in January, will further explore the implications of each in the contemporary social and legal landscape.

Security Token: Defining a security is a prerequisite for a security token discussion. Securities are any kind of tradable asset, including stocks, bonds, mutual funds, etc. Security tokens, therefore, are tradable assets for those who hold them, capable of providing dividends in multiple forms. As would be expected, companies using security tokens to garner investment need to do so in accordance with the rules of the appropriate governmental authorities, for instance the U.S. Securities and Exchange Commission. The ramifications of choosing whether or not to do so will be discussed in Part II of this blog post.

Equity Token: Equity tokens are a category of security token that may be particularly useful for early-stage companies looking to raise funds. In their most basic form, equity tokens allow companies to sell stock through coin purchases. What makes equity tokens more intriguing than traditional investment measures is how the blockchain transforms equity ownership for investors. The presence of a blockchain allows for transparency of ownership, and corporate voting can also be captured and stored in more transparent ways. As a category of security token, equity tokens must be issued in accordance with the rules of the appropriate governmental authorities to ensure that they are legally executed. Part II of this post will also explore other requirements for the success of an equity token.

Utility Token: Utility tokens differ from securities-backed tokens because they do not rely on the expectation of profit. Instead, utility tokens provide users with access to a useful service or product. For example, six major banks including UBS, HSBC and Credit Suisse have developed their own utility coin, planned to launch in 2018, that represents interbank payments, thereby reducing the need to rely on sluggish traditional money transfers or brokers. Utility tokens can come in an endless variety of flavors and are not subject to governmental regulation so long as companies can explicitly demonstrate that the tokens are not securities-backed. However, utility tokens are not without their own limitations, some of which will be covered in Part II.

Initial Coin Offerings (ICOs) are a nascent form of business fundraising that will likely grow in popularity across the coming years. Stay tuned to Part II that will discuss some of the strengths and limitations of each category of token offering.

References

1.) https://www.learnvest.com/knowledge-center/investing-101-what-is-a-security/
2.) https://www.wsj.com/articles/sec-chief-fires-warning-shot-against-coin-offerings-1510247148
3.) http://strategiccoin.com/3-types-ico-tokens/

by Justin Barbaro

The New Food Chain: How Blockchain Will Transform the Food Industry

EDITOR’S NOTE: This article is about how blockchain is helping the food industry to better serve consumers around the globe. True Interaction built SYNAPTIK, our Data Management, Analytics, and Data Science Simulation DMP, specifically to make it easy for leaders to collect and manage data, for instance from blockchain databases, to get to insights faster. For more information or a demo, please visit us at https://synaptik.co/ or email us at hello@www.true.design.

For now, Bitcoin is getting all the headlines. Yet the versatility of the blockchain will likely be the disruption to remember.

Initially constructed as the technology underlying Bitcoin, a cryptocurrency, the blockchain is a decentralized ledger that records transactions. What makes the blockchain a disruptive innovation is that transactions are almost impossible to edit or manipulate after the fact: each record is cryptographically linked to the one before it, so any alteration anywhere in the chain is immediately detectable. The potential applications of the blockchain are still being discovered given the many ways that different industries conduct transactions.
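The tamper-evidence property can be sketched in a few lines: each block stores the hash of the previous block, so editing an earlier record breaks every link after it. This is a minimal teaching sketch, not a real blockchain (no signatures, consensus or mining):

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    # Deterministic hash of a block's contents.
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_block(chain: list, data: dict) -> None:
    # Each new block commits to the hash of the previous one.
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": prev, "data": data})

def is_valid(chain: list) -> bool:
    # The chain is intact only if every stored link still matches.
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

chain: list = []
append_block(chain, {"tx": "A pays B 5"})
append_block(chain, {"tx": "B pays C 2"})
print(is_valid(chain))                     # True
chain[0]["data"]["tx"] = "A pays B 500"    # tamper with an earlier record
print(is_valid(chain))                     # False: the edit is detectable
```

In a real decentralized network, many independent nodes hold copies of the chain, so an attacker would have to rewrite the majority of copies simultaneously.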

The food industry is one area in which the blockchain already enjoys an immediate impact. Below are three ways in which the blockchain will improve operations across this critical industry:

Supply Chain: The blockchain is currently enhancing supply chain management across the food industry. Food industry giants such as Walmart, Kroger and Tyson Foods have begun automating their supply chains by tracking key information including the temperature, quality and shipping dates of certain perishable and non-perishable goods. Blockchain providers such as IBM are already looking to partner with similar enterprises so that these metrics are stored on an un-editable ledger to ensure fidelity of the supply chain from producer to consumer. This ledger will also be able to serve as a transparent database, allowing food industry companies to leverage analytics to better understand their supply chain bottlenecks, efficiencies and areas for transformation.
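Once shipment records sit on a shared ledger, the analytics described above reduce to simple queries over trusted data. A sketch under assumed field names (the record schema below is our illustration, not any vendor's format):

```python
# Illustrative ledger records for perishable shipments; fields are assumptions.
shipments = [
    {"sku": "lettuce-01", "temp_c": 3.5, "days_in_transit": 2},
    {"sku": "lettuce-02", "temp_c": 9.1, "days_in_transit": 5},
    {"sku": "chicken-07", "temp_c": 1.2, "days_in_transit": 3},
]

# Flag goods that broke the cold chain (> 4 °C) or sat too long in transit.
at_risk = [s["sku"] for s in shipments
           if s["temp_c"] > 4.0 or s["days_in_transit"] > 4]
print(at_risk)  # ['lettuce-02']
```

Because the underlying ledger is un-editable, producers, shippers and retailers can all run queries like this against the same records without disputing whose numbers are correct.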

Food Safety: The blockchain used to monitor supply chain transactions also has the potential to dramatically improve food safety, a serious issue in much of the developing world. A late 2015 report from the World Health Organization (WHO) estimates that every year 1 in 10 people fall ill from eating contaminated foods. The effects of food safety challenges are particularly acute in the young, with over 125,000 children estimated to die annually from unsafe foods. Blockchain stands to reduce these unfortunate and preventable incidents in three ways: 1.) by providing consumers with transparency that the foods they are eating match the ingredients on the label; 2.) by capturing any event in which the food may have been tampered with at any point in the supply chain; and 3.) by enabling retailers to pull potentially hazardous foodstuffs from shelves after any incident.

Payments: Blockchain stands to transform payments in the food industry. Food producers, many of whom sell their items at commodity rates, would be able to demonstrate proof of sale instantly using blockchain technology. Similarly, food distributors would be able to make payments to producers with greater ease and trust. Blockchain technology also has the potential to cut out middlemen and lower transaction fees, another promising development for small- or medium-sized food producers.

References

1.) https://theconversation.com/how-blockchain-technology-could-transform-the-food-industry-89348
2.) http://www.who.int/mediacentre/news/releases/2015/foodborne-disease-estimates/en/
3.) http://fortune.com/2017/08/22/walmart-blockchain-ibm-food-nestle-unilever-tyson-dole/
4.) https://www.anxintl.com/blog/2017/11/27/the-next-industry-to-be-revolutionised-by-blockchain-the-food-industry

by Justin Barbaro

So, what exactly is a DMP?

EDITOR’S NOTE: This article is about how data management platforms (DMPs) can assist decision-makers in organizing their data in ways leading to strategic insights. True Interaction built SYNAPTIK, our Data Management, Analytics, and Data Science Simulation DMP, specifically to make it easy for leaders to collect and manage data to get to insights faster. For more information or a demo, please visit us at https://synaptik.co/ or email us at hello@www.true.design.

A data management platform, or DMP, imports, stores and compiles customer or target audience data from various sources and makes it actionable. It can ingest, sanitize, sort and format data. Most importantly, it can analyze and segment this data and present it in a visual format that is easily understood and made use of by executive decision makers.
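The ingest → sanitize → segment flow described above can be sketched in a few lines. This is a hypothetical miniature of what a DMP automates at scale; the field names and segment rules are assumptions for illustration:

```python
# Hypothetical raw feed from one of many sources: inconsistent and incomplete.
raw_records = [
    {"email": " Alice@Example.com ", "purchases": "3", "channel": "web"},
    {"email": "bob@example.com", "purchases": "0", "channel": "mobile"},
    {"email": None, "purchases": "1", "channel": "web"},  # missing key field
]

def sanitize(record):
    """Normalize one record; drop records missing the join key."""
    if not record.get("email"):
        return None
    return {"email": record["email"].strip().lower(),
            "purchases": int(record["purchases"]),
            "channel": record["channel"]}

clean = [r for r in (sanitize(rec) for rec in raw_records) if r]

# Segment the cleaned audience, ready for activation or visualization.
segments = {"buyers":    [r for r in clean if r["purchases"] > 0],
            "prospects": [r for r in clean if r["purchases"] == 0]}
print(len(segments["buyers"]), len(segments["prospects"]))  # 1 1
```

A production DMP does this continuously across many feeds and far richer schemas, but the shape of the work, normalize, deduplicate, segment, is the same.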

In the AdTech/Martech world, there’s a common misconception that DMPs are somehow exclusive to the digital advertising ecosystem, where DMPs produce audience segments that are syndicated to external ad targeting and content delivery platforms and compared across channels. The reality is that everyone from toy stores to hedge funds and even government agencies is employing DMPs for internal data management. Finance firms may use a DMP for data forensics. Retail giants are increasingly employing DMPs as 1:1 data engines that personalize the e-commerce experience with recommendation engines and displays based upon rich user profiles. Enterprise organizations and SMBs alike utilize DMPs for non-advertising/marketing tasks such as aggregation of scraped and purchased data sets, business intelligence, product management and inventory. In fact, DMPs utilizing AI have been replacing traditional supply chain management departments at a rapid pace in 2017.

Both B2B and B2C organizations leverage a DMP to understand customer audiences based upon conversion, engagement and purchase rates, and to target audiences with personalized and therefore more effective messaging. A typical B2B use case involves matching and correlating 1st party data with 3rd party data for lookalike modeling, which provides channel clarity and enables businesses to build company-level profiles that group together individuals associated with or employed by organizations that are sales targets.

The first thing an organization considering purchasing a DMP should do is establish what the process and flow will look like for the intake of data from multiple sources stored in various locations. Run a test with a month’s worth of data and see what kind of issues you encounter getting your data ingested and normalized. Concurrent with establishing a process for input, you want to understand your own business goals and ultimately identify what is of value to your team’s mission. This could be new registrations and cohort starts, value per acquisition (VPA) or customer lifetime value (CLV). An organization could simply want to understand the impact of its newly refurbished, responsive website on e-commerce, and which platform, device or browser its most avid customers use. Prioritize the top 4 or 5 data points and make sure these are stood up as part of the initial integration.

In the digital advertising industry, it can be a challenge to differentiate between a DMP and a demand-side platform (DSP) as the lines are continually blurred. Some DSPs have begun to offer DMP functionality to inform their ad purchasing and to avoid lag and integration problems that typically result from today’s fragmented martech/adtech stack. Some have morphed into SSPs that integrate with multiple DMPs while simultaneously offering their own DMP service. Bottom line though: DMPs store and process data, sort it and provide context and insight. Beyond that, they don’t function as an exchange or DSP. A DMP does not coordinate programmatic ad campaigns for you.

A DMP can be used to map different SourceIDs and cookie IDs across the ecosystem. This is a major problem that needs resolution, as no industry standard exists. Ad networks, mobile exchanges, middleman measurement tools, data management platforms, fraud vendors, SSPs, agency trading desks and DSPs all use various IDs to track transactions, so attribution can get quite complicated.
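At its core, this ID reconciliation is a lookup from (vendor, vendor-specific ID) pairs onto a single internal user ID, so events from different platforms can be joined for attribution. A minimal sketch; the vendor names and IDs below are made-up assumptions:

```python
# Hypothetical cross-vendor identity table maintained by a DMP.
id_map = {
    ("ad_network_a", "ck_9f2"):  "user_001",
    ("dsp_b",        "src_771"): "user_001",
    ("exchange_c",   "ck_x44"):  "user_002",
}

def resolve(vendor, vendor_id):
    """Map a vendor-specific ID to the internal user ID, if known."""
    return id_map.get((vendor, vendor_id))

# Two events from different vendors resolve to the same person:
print(resolve("ad_network_a", "ck_9f2"))  # user_001
print(resolve("dsp_b", "src_771"))        # user_001
```

Building and maintaining this table (via deterministic matches like hashed emails, or probabilistic matches like device fingerprints) is where most of the real difficulty lies.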

A good DMP can cleanse and process structured and unstructured data alike and generate visual analytics for the data from multiple departments, programs and campaigns. Ideally, the data becomes actionable and decisions become validated, justified and quantified by the insights produced. Data is compartmentalized and segments may be produced. As we approach 2018, I can’t imagine recommending a DMP that is not 100% cloud based, as it needs to scale. Similarly, it should possess an intelligent layer of machine learning. A good DMP offers its users the option of either API stream or S3 data bucket upload, whichever the customer prefers.

Clearly the point is to manage one’s data but also to merge it and make sense of it. Ultimately, a DMP should enable the monetization of an organization’s data. A good DMP will create one holistic view of all data within an organization. Synaptik is a DMP that is flexible enough to address your strategic data needs across a number of organizational functions: Finance, Analytics, IT, Marketing, Operations and Customer Relationship Management, among others. Synaptik’s advanced intelligent layer can even draw correlations between the different data. While most businesses are overwhelmed by the sheer volume of data that they are failing to leverage, others may be intimidated by the thought of purchasing a DMP because they don’t think they have the capacity, or the technical DNA in house, to take this kind of thing on. Well, a DMP is supposed to minimize both labor and angst, and should come with frictionless on-boarding and attentive support for rule mapping and customization. The DMP staff should be falling over themselves to meet your terms. Pointing customers to a rabbit hole of self-help technical article links and leaving it up to the customer to figure out how best to get things up and running is not acceptable. The DMP you select should be intuitive enough for you to configure on your own once it’s been deployed. Lastly, a good DMP should feel agnostic and customizable.

At True Interaction, we pride ourselves on our Digital Transformation Services along with our Data Intelligence acumen. Please schedule a time to have a discovery conversation today.

by David Sheihan Hunter Lindez

Three Digital Marketing Innovations for Higher Education

EDITOR’S NOTE: This article is about how higher education marketing teams can transform their current strategies using digital transformation. True Interaction built SYNAPTIK, our Data Management, Analytics, and Data Science Simulation Platform, specifically to make it easy to collect and manage data, including marketing data, for more meaningful insights. For more information or a demo, please visit us at https://synaptik.co/ or email us at hello@www.true.design.
