SEO: No Longer Enough

EDITOR’S NOTE: This article is about how to approach and think about Search Engine Optimization (SEO). True Interaction built SYNAPTIK, our Data Management, Analytics, and Data Science Simulation Platform, specifically to make it easy to collect and manage core and alternative data for more meaningful data discovery. For more information or a demo, please visit us at or email us at

So you have built an amazing product and an equally amazing website to match. What can you do to stand out and get found online amongst fierce competition? You need to leverage every tool within your reach – in particular ones that are of low or minimal cost. Search Engine Optimization (SEO) is one of the best tools to make sure your potential clients find your site, and new advances in Machine Learning technology hold the power to take SEO strategy to the next level.

A clear understanding of Machine Learning, Search Engine Optimization (SEO), and Content Marketing is necessary before discussing how the first can positively impact the other two. For the purposes of this post:

Machine Learning is an application of artificial intelligence (AI) that provides systems the ability to automatically learn and improve from experience without being explicitly programmed. Machine learning focuses on the development of computer programs that can access data and use it to learn for themselves.1

Search Engine Optimization (SEO) is a methodology of strategies, techniques and tactics used to increase the number of visitors to a website by obtaining a high-ranking placement on the search engine results page (SERP) of Google, Bing and other search engines. Using specific keywords on landing pages is one of the best-known tactics to increase the number of site visitors.2

Content Marketing is a type of marketing that involves the creation and sharing of online material (such as videos, blogs, and social media posts) that does not explicitly promote a brand but is intended to stimulate interest in its products or services.3

So how can Machine Learning drive SEO for your website? One primary use case is through listening for, identifying and even predicting the optimal keywords to drive users to your site. This process may look similar to the one below:

Listen for new SEO keywords. Many Natural Language Processing tools, and more specifically Sentiment Analysis tools, that harness Machine Learning algorithms enable business users to track what potential customers are saying and how they feel about social media content over time. Analyses run on these findings can allow users to identify common noun-based or emotion-based keywords to display in prominent locations on their landing and internal pages to increase SEO scores.
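As a rough illustration of this listening step, the toy lexicon, posts, and word lists below are all invented; a production tool would use a trained sentiment model rather than hand-made word sets:

```python
from collections import Counter

# Toy sentiment lexicon -- a real system would use a trained model,
# not hand-picked word sets.
POSITIVE = {"love", "great", "fast", "reliable"}
NEGATIVE = {"slow", "broken", "expensive", "buggy"}

def score_posts(posts):
    """Tally candidate keywords and a crude net sentiment across posts."""
    keywords = Counter()
    sentiment = 0
    for post in posts:
        words = [w.strip(".,!?") for w in post.lower().split()]
        sentiment += sum(w in POSITIVE for w in words)
        sentiment -= sum(w in NEGATIVE for w in words)
        keywords.update(w for w in words
                        if len(w) > 3 and w not in POSITIVE | NEGATIVE)
    return keywords.most_common(3), sentiment

posts = [
    "Love the dashboard, great analytics",
    "Analytics export is slow and buggy",
]
top_keywords, net_sentiment = score_posts(posts)
```

Here "analytics" would surface as the leading candidate keyword, with the positive and negative mentions canceling out to a neutral net sentiment.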

Seek out connections between SEO keywords. Cluster analyses automated through Machine Learning can allow business users to find hidden connections between SEO keywords. After using social listening tools, business users can visualize groupings of emergent noun-based or emotion-based keywords to determine connections between them. Doing so can empower business users to organize their landing and other pages around popular keyword clusters to increase the site's SEO score.
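One crude way to sketch this clustering step is to group keywords that co-occur in the same posts; the posts and keywords below are invented, and a real pipeline would more likely run k-means or hierarchical clustering over TF-IDF vectors or embeddings:

```python
from collections import defaultdict

def cocluster(posts, keywords):
    """Group keywords that appear together in the same posts.

    A crude stand-in for real cluster analysis: keywords sharing a
    post end up in the same cluster (union-find style grouping).
    """
    parent = {k: k for k in keywords}

    def find(k):
        while parent[k] != k:
            k = parent[k]
        return k

    for post in posts:
        present = [k for k in keywords if k in post.lower()]
        for a, b in zip(present, present[1:]):
            parent[find(a)] = find(b)

    clusters = defaultdict(set)
    for k in keywords:
        clusters[find(k)].add(k)
    return sorted(map(sorted, clusters.values()))

posts = ["pricing page feels slow", "slow checkout hurts pricing trust",
         "great support team"]
groups = cocluster(posts, ["pricing", "slow", "checkout", "support"])
```

With this toy data, "pricing", "slow", and "checkout" fall into one cluster while "support" stands alone, hinting that checkout speed and pricing perception belong on the same landing page.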

Predict the future of SEO keywords. Time Series analysis and forecasting allows business users to use machine learning to look into the future of SEO for their business. By harnessing Machine Learning libraries like TensorFlow, business users can leverage existing natural language and sentiment data gleaned from social and other digital sources to look for cyclical or other patterns in common noun-based or emotion-based keywords. These predictive capabilities allow business users to stay ahead of the competition through SEO leadership.
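At its simplest, such a forecast can be sketched as a trailing moving average over weekly keyword mention counts; the counts here are invented, and the TensorFlow models mentioned above would also capture trend and seasonality that this baseline ignores:

```python
def forecast_next(counts, window=3):
    """Project next period's keyword volume as a trailing moving average.

    A deliberately simple baseline for keyword-volume forecasting.
    """
    recent = counts[-window:]
    return sum(recent) / len(recent)

# Hypothetical weekly mention counts for one keyword
weekly_mentions = [120, 135, 150, 160, 175]
next_week = forecast_next(weekly_mentions)  # average of the last 3 weeks
```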

So how can business users tap machine learning capabilities, including natural language processing/sentiment analysis, cluster analysis and time series analysis, on their own data with one easy-to-use platform? Synaptik. The Synaptik platform also provides data management and visualization, as well as plugins customized to the needs of business users. Sign up for a 30-minute consultation and we can show you what customers are saying about your products and services across multiple social media channels (Facebook, Twitter, LinkedIn, etc.).




By Joe Sticca

Resiliency Tech: A Signal in the Storm

Redundancy is a four-letter word in most settings, but when it comes to emergency management and disaster relief, redundant systems reduce risk and save lives. Tropical Storm Harvey caused at least 148,000 outages for internet, TV and phone customers, making it impossible for people to communicate over social media and text. In this blog post, we explore innovative ways smart cities can leverage big data and Internet of Things (IoT) technology to MacGyver effective solutions when go-to channels break down.

Flood Beacons

Designer Samuel Cox created the flood beacon to share fast and accurate flood condition information. Most emergency management decisions are based on forecasts and person-to-person communications with first responders and people in danger. With the flood beacon, you can find out water levels, GPS coordinates and water movements in real time. The beacon is designed to have low power requirements and uses solar power to stay charged. Now it will be up to the IoT innovators of the world to turn the flood beacon into a complete solution that can broadcast emergency center locations and restore connectivity to impacted areas.

EMS Drones

The Health Integrated Rescue Operations (HiRO) Project has developed a first responder drone that can drop medical kits, emergency supplies and Google Glass for video conference communication. “EMS response drones can land in places that EMS ground vehicles either cannot get to or take too long to reach,” says Subbarao, a recognized expert in disaster and emergency medicine. “Immediate communications with the victims and reaching them rapidly with aid are both critical to improve outcomes.” (“One of These Drones Could Save Your Life,” NBC News, Jan. 12, 2017)

Big Data Analytics and Business Intelligence

Emergency management agencies and disaster relief organizations have been using crowdsourcing and collaborative mapping tools to target impact areas, but poor data quality and a lack of cross-agency coordination continue to challenge the system. Business intelligence platforms that provide access to alternative data sets and machine learning models can help government agencies and disaster relief organizations corroborate and collaborate. By introducing sentiment analysis, keyword search features and geotags, organizations can quickly identify high-need areas. Furthermore, BI platforms with project management and inventory plug-ins can aggregate information and streamline deployment.
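A toy sketch of this triage idea, with invented messages, geotags, and urgency keywords; production systems would add sentiment models, deduplication, and verification:

```python
from collections import Counter

# Invented urgency keywords for illustration
URGENT = {"trapped", "flooded", "injured", "no power"}

def rank_areas(messages):
    """Count urgent-keyword hits per geotag to surface high-need areas."""
    hits = Counter()
    for geotag, text in messages:
        if any(k in text.lower() for k in URGENT):
            hits[geotag] += 1
    return hits.most_common()

messages = [
    ("29.76,-95.36", "family trapped on the roof"),
    ("29.76,-95.36", "street fully flooded, need boats"),
    ("29.81,-95.40", "shelter open, supplies available"),
]
ranked = rank_areas(messages)  # geotags sorted by urgent-message volume
```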

Smart emergency management systems must be flexible, redundant and evolve with our technology. At True Interaction, we believe that traditional private sector business intelligence tools and data science capabilities can help cross-agency collaboration, communication and coordination. Our core team of software developers is interested in teaming up with government agencies, disaster relief organizations and IoT developers to create better tools for disaster preparation and relief service delivery. Contact us here if you are interested in joining our resiliency tech partnership!

Can Artificial Intelligence Catalyze Creativity?

In the 2017 “cerebral” Olympic games, artificial intelligence defeated the human brain in several key categories. Google’s AlphaGo beat the best player of Go, humankind’s most complicated strategy game; algorithms taught themselves how to predict heart attacks better than the AHA (American Heart Association); and Libratus, an AI built by Carnegie Mellon University, beat four top poker players at no-limit Texas Hold ‘Em. Many technologists agree that computers will eventually outperform humans on step-by-step tasks, but when it comes to creativity and innovation, humans will always be a part of the equation.

Inspiration, from the Latin inspiratus, literally means “breathed into.” It implies a divine gift – the aha moment, the lightning bolt, the secret sauce that can’t be replicated. Around the globe, large organizations are attempting to reculture their companies to foster innovation and flexibility, two core competencies needed to survive the rapid-fire rate of change. Tom Agan’s HBR article titled “The Secret to Lean Innovation” identified learning as the key ingredient, while Lisa Levey believes that seeing failure as a part of success is key.

At the same time, although innovation is a human creation, machines do play a role in that process. Business leaders are using AI and advanced business intelligence tools to make operations more efficient and generate higher ROI, but are they designing their digital ecosystems to nurture a culture of innovation? If the medium is the message, then they should be.

“If you want to unlock opportunities before your competitors, challenging the status quo needs to be the norm, not the outlier. It will be a long time, if ever, before AI replaces human creativity, but business intelligence tools can support discovery, collaboration and execution of new ideas.” – Joe Sticca, COO at Synaptik

So, how can technology augment your innovation ecosystem?


New business intelligence tools can help you manage innovation, from sourcing ideas to generating momentum and tracking return on investment. For instance, to prevent corporate tunnel vision, you can embed online notifications that superimpose disruptive questions on a person’s screen. With this simple tool, managers can help employees step outside the daily grind to reflect on the larger questions and how they impact today’s deliverable.


The market is flooded with collaboration tools that encourage employees to leverage each other’s strengths to produce higher quality deliverables. The most successful collaboration tools are those that seamlessly fit into current workflows and prioritize interoperability. To maximize innovation capacity, companies can use collaboration platforms to bring more diversity to the table by inviting external voices including clients, academics and contractors into the process.


Social listening tools and sentiment analysis can provide deep insights into the target customer’s needs, desires and emotional states. When inspiration strikes, innovative companies are able to prototype ideas quickly and share those ideas with the digital universe to understand what sticks and what stinks. By streamlining A/B testing and failing fast and often, agile companies can reduce risk and regularly test their ideas in the marketplace.
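The "fail fast" A/B loop can be sketched with a classic two-proportion z-test; the conversion counts below are invented, and real experimentation programs also fix sample sizes in advance and correct for repeated peeking:

```python
import math

def ab_zscore(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is variant B's conversion rate really higher?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p = (conv_a + conv_b) / (n_a + n_b)          # pooled rate
    se = math.sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical test: 5.0% vs 6.5% conversion over 2,400 visitors each
z = ab_zscore(conv_a=120, n_a=2400, conv_b=156, n_b=2400)
```

A |z| above roughly 1.96 corresponds to significance at the 5% level, so this imagined variant B would clear the bar.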

While computers may never birth the aha moments that drive innovation, advanced business intelligence tools and AI applications can capture sparks of inspiration and lubricate the creative process. Forward-thinking executives are trying to understand how AI and advanced business intelligence tools can improve customer service, generate higher ROI, and lower production costs. Companies like Cogito are using AI to provide real-time behavioral guidance to help customer service professionals improve the quality of their interactions while Alexa is using NLP to snag the full-time executive assistant job in households all over the world.

Creativity is the final frontier for artificial intelligence. But rather than AI competing against our innovative powers, business intelligence tools like Synaptik can bolster innovation performance today. The Synaptik difference is an easy user interface that makes complex data management, analytics and machine learning capabilities accessible to traditional business users. We offer customized packages that are tailored to your needs and promise to spur new ideas and deep insights.

By Nina Robbins

Evolution of Big Data Technologies in the Financial Services Industry

Our previous post provides an industry analysis that examines the maturity of banking and financial markets organizations. The significant deviations from the traditional business model within the financial services industry in recent years emphasize the increasing need for a difference in how institutions approach big data. The long-standing industry, so firmly entrenched in its decades-long practices, is seemingly dipping its toes into the proverbial pool of big data as organizations recognize that its implementation is integral to a firm’s survival, and ultimately its growth. IBM’s Big Data @ Work survey reports that 26 percent of banking and financial markets companies are focused on understanding the concepts surrounding big data. On the other end of the spectrum, 27 percent are launching big data pilots, but the majority of the companies surveyed in this global study (47 percent) remain in the planning stage of defining a road map towards the efficient implementation of big data. For those organizations still in the stage of planning and refinement, it is crucial to understand and integrate the observed trends within financial technologies that can bolster a company’s big data strategy.

Customer Intelligence

While banks have historically maintained a monopoly on their customers’ financial transactions, the current state of the industry, with competitors flooding the market on different platforms, prevents this practice from continuing. Banks are being transformed from product-centric to customer-centric organizations. Of the survey respondents with big data efforts in place, 55 percent report customer-centric objectives as one of their organization’s top priorities, if not their utmost aim. In order to engage in more customer-centric activities, financial service companies need to enhance their ability to anticipate changing market conditions and customer preferences. This will in turn inform the development and tailoring of their products and services towards the consumer, swiftly seizing market opportunities as well as improving customer service and loyalty.

Machine Learning

Financial market firms are increasingly becoming aware of the many potential applications for machine learning and deep learning, two of the most prominent being within the fraud and risk sectors of this industry. The sheer volume of consumer information collected from the innumerable transactions conducted daily across a plethora of different platforms calls for stronger protocols around fraud and risk management. Many financial services companies are just beginning to realize the advantageous inclusion of machine learning within an organization’s big data strategy. One such company is PayPal, which, through a combination of linear, neural network, and deep learning techniques, is able to optimize its risk management engines in order to identify the level of risk associated with a customer in mere milliseconds. The potential foreshadowed by these current applications is seemingly endless, optimistically suggesting the feasibility of machine learning algorithms replacing statistical risk management models and becoming an industry standard. The overall value that financial institutions can glean from the implementation of machine learning techniques is access to actionable intelligence based on the previously obscured insights uncovered by means of such techniques. The integration of machine learning tactics will be a welcome catalyst in the acceleration towards more real-time analysis and alerting.
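PayPal's actual engines blend linear, neural-network, and deep-learning models; as a toy stand-in, a single logistic model with invented feature names and weights shows the shape of the idea:

```python
import math

def risk_score(features, weights, bias):
    """Logistic model mapping transaction features to a fraud risk in [0, 1].

    Feature names and weights are invented for illustration only.
    """
    z = bias + sum(weights[name] * value for name, value in features.items())
    return 1 / (1 + math.exp(-z))

weights = {"amount_zscore": 1.2, "new_device": 0.8, "foreign_ip": 0.6}
transaction = {"amount_zscore": 2.5, "new_device": 1, "foreign_ip": 0}
score = risk_score(transaction, weights, bias=-3.0)  # ~0.69: flag for review
```

Because the model is a single closed-form expression, a score like this can be computed in well under a millisecond per transaction, which is what makes real-time risk decisions feasible.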


Internet of Things

When attempting to chart the future of financial technology, many point to the Internet of Things (IoT) as the next logical step. Often succinctly described as machine-to-machine communication, the IoT is hardly a novel concept, with the continual exchange of data already occurring between “smart” devices despite the lack of human interference. As some industries, such as retail and manufacturing, already utilize this technology to some extent, it is not a far-fetched notion to posit that the financial service industry will soon follow suit. While there are those who adamantly reject the idea due to the industry being in the business of providing services as opposed to things, this would be a dangerously myopic view in this day and age. Anything from ATMs to information kiosks could be equipped with sensing technology to monitor and take action on the consumer’s behalf. Information collected from real-time, multi-channel activities can aid in informing how banks provide the best, most timely offers and advice to their customers.

For more information to empower your data science initiatives, please visit us at. We pride ourselves on empowering everyday users to do great data discovery without the need for deep technical development skills.

Joe Sticca, Chief Operating Officer of True Interaction, contributed to this post.

By Justin Barbaro

Achieving Continuous Business Transformation Through Machine Learning

In the past, I’ve blogged about applying Agile methodology to businesses at large: what we are seeing in business today is that the concept of “business transformation” isn’t something that is undergone once or even periodically – business transformation is becoming a continuous process.

Today, businesses at large – not just their creative and development silos – benefit from operating in an Agile manner, most importantly in the area of responding to change over following a plan. Consider the words of Christa Carone, chief marketing officer for Xerox:

“Where we are right now as an enterprise, we would actually say there is no start and stop because the market is changing, evolving so rapidly. We always have to be aligning our business model with those realities in the marketplace.”

This is an interesting development, but how, technically, can businesses achieve True Interaction in a continually transforming world?

Business Transformation: Reliance upon BI and Analytics

In order for an organization’s resources to quickly and accurately make decisions regarding a product, opportunity or sales channel, they must rely upon historic and extrapolated data provided by their organization’s Data Warehouse/Business Analytics group.

In a 2016 report by ZS Associates that surveyed 448 senior executives and officials across a myriad of industries, 70% of respondents replied that sales and marketing analytics is already “very important” or “extremely important” to their business’s competitive advantage. Furthermore, the report reveals that in just two years’ time, 79% of respondents expect this to be the case.

However, some very interesting numbers reveal cracks in the foundation: Only 12% of the same respondents could confirm that their organization’s BI efforts are able to stay abreast of the continually changing industry landscape. And only 2% believe business transformation in their company has had any “broad, positive impact.”

5 Reasons for lack of BI impact within an organization

1) Poor Data Integration across the business

Many Legacy BI systems include a suite (or a jumbled mess) of siloed applications and databases: There might be an app for Production Control, MRP, Shipping, Logistics, Order Control, for example, with corresponding databases for Finance, Marketing, Sales, Accounting, Management Reporting, and Human Resources – in all, creating a Byzantine knot of data hookups and plugins that are unique, perform a singular or limited set of functions, and are labor intensive to install, scale and upgrade.

2) Data collaboration isn’t happening enough between BI and Business Executives

Executives generally don’t appear to have a firm grasp of the pulse of their BI: only 41% of ZS Associates’ report participants thought that a collaborative relationship between professionals working directly with data analytics and those responsible for business performance exists at their company.

3) Popular Big Data Solutions are still siloed

Consider Ed Wrazen’s critique of Hadoop: During an interview at Computing’s recent Big Data and Analytics Summit, the Vice-President of Product Management at data quality firm Trillium Software revealed:

“My feeling is that Hadoop is in danger of making things worse for data quality. It may become a silo of silos, with siloed information loading into another silo which doesn’t match the data that’s used elsewhere. And there’s a lot more of it to contend with as well. You cannot pull that data out to clean it as it would take far too long and you’d need the same amount of disk storage again to put it on. It’s not cost-effective or scalable.”

4) Data Integration is still hard to do.

Only 44% of respondents to ZS Associates’ report ranked their organizations as “good” or “very good” at data aggregation and integration. 39% said that data integration and preparation were the biggest challenges within the organization, while 47% listed this as one of the areas in their organization where improvement would produce the most benefit.

5) Only part of the organization’s resources can access BI

Back in the day, BI used to be the sole province of IT data experts and information analyst specialists. Now companies are seeing the benefits of democratizing the access and analysis of data across the organization. Today, a data analyst could be a product manager, a line-of-business executive, or a sales director. In her book Successful Business Intelligence, Cindi Howson, author and instructor for The Data Warehousing Institute (TDWI), famously remarked:

“To be successful with BI, you need to be thinking about deploying it to 100% of your employees as well as beyond organizational boundaries to customers and suppliers… The future of business intelligence centers on making BI relevant for everyone, not only for information workers and internal employees, but also beyond corporate boundaries, to extend the reach of BI to customers and suppliers.”

Business leaders should examine these symptoms in the context of their own organizations.

Is there a solution to these issues?

True Interaction CEO O. Liam Wright has a novel approach to a new kind of BI solution, one that involves machine learning.

“In my 20 years in the business I’ve discovered several worlds that never spoke to each other properly, due to siloed information spaces, communications, platforms and people. Today’s fluid world necessitates changing the BI game completely: If you want to have a high level of true interaction between your systems, platforms, customers, internal multiple hierarchical department structures, then you NEED to flip the game around. SQL-based Dashboards are old news; they are so 2001.

You can’t start with a structured, SQL based situation that inevitably will require a lot of change over time – organizations don’t have the IT staff to continually support this kind of situation – it’s too expensive.”

Instead Liam took a different approach from the beginning:

“I thought, what if we capitalized on the sheer quantity and quality of data today, and captured data in unstructured (or structured) formats, and put this data into a datastore that doesn’t care what type of data it is. Then – as opposed to expensive rigid, SQL-based joins on data types – we implement lightweight ‘builds’ on top of the data. These lightweight builds enable businesses to start creating software experiences off of their base dataset pretty quickly. They also enable organizations to get Business Intelligence right out of the box as soon as they perform a build – dynamic dashboards and data visualizations which can become much more sophisticated over time, as you pull in and cross-pollinate more data. Then, when the data is further under control, you can create software agents that can assist you in daily data processes, or agents that get those controls out the door.”

What is this BI and Machine Learning marriage, exactly?

So what exactly is Liam describing? Modern business has just crossed the threshold into an exciting new space – organizations can routinely implement an integrated Machine Learning component that runs in the background, ingests data of all types from any number of people, places, and platforms, intelligently normalizes and restructures it so it is useful, runs a dynamic series of actions based upon data type and whatever situational contexts your business process is in, and creates dynamic BI data visualizations out of the box.

True Interaction’s machine learning solution is called SYNAPTIK.

SYNAPTIK involves 3 basic concepts:

DATA: SYNAPTIK can pull in data from anywhere. Its user-friendly agent development framework can automate most data aggregation and normalization processes. It can be unstructured data from commerce, web, broadcast, mobile, or social media; it can be audio, CRM data, apps, or images. It can also pull in structured data, for example from SAP, Salesforce, Google, communications channels, publications, Excel sheets, or macros.

AGENT: A software agent is a program that acts on behalf of a user or another program. Agents can be configured not only to distribute data intelligence in flexible ways but also to directly integrate and take action in other internal and external applications for quicker transformation to enhance your business processes and goals.

An Agent is composed of two parts: the operator and the controls. Think of an operator as the classic telephone operator of the 1940s who manually plugged in and unplugged your calls in the background. SYNAPTIK enables you to see how the operator works:

Operators can be written in several languages, such as JavaScript, PHP/cURL, or Python. Organizations can write their own operators, or True Interaction’s development team can write them for you. An Agent also gives the user a control interface – a form field, or drag-and-drop functionality – in order to add specific assets or run any variety of functions. In addition, SYNAPTIK makes it easy to connect to a REST API, enabling developers to write their own software on top of it.

BUILD: A build simply brings the DATA and AGENT components together, ultimately enabling you to better understand your organization’s various activities within your data space.
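The DATA/AGENT/BUILD split might be sketched like this; the class, method names, and records below are hypothetical illustrations, not SYNAPTIK's actual API:

```python
class Agent:
    """Hypothetical sketch of the operator/controls split described above."""

    def __init__(self, operator, controls):
        self.operator = operator   # background worker: pulls and transforms data
        self.controls = controls   # user-facing parameters, e.g. form fields

    def run(self, datastore):
        return self.operator(datastore, **self.controls)

def keyword_count_operator(datastore, keyword):
    """Operator: count records mentioning a keyword, regardless of source."""
    return sum(keyword in rec["text"].lower() for rec in datastore)

datastore = [  # schema-agnostic records pulled from anywhere
    {"source": "twitter", "text": "Synaptik demo was great"},
    {"source": "crm", "text": "asked for a Synaptik trial"},
]
agent = Agent(keyword_count_operator, controls={"keyword": "synaptik"})
mentions = agent.run(datastore)  # the "build" wires data and agent together
```

The point of the sketch is that the datastore carries no fixed schema: the same agent runs unchanged whether the records came from social media, a CRM, or a spreadsheet export.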

A new model of BI: What is the Return?

– Machine Learning platforms like SYNAPTIK enable organizations to create wide and deep reporting, analytics and machine learning agents without being tied to expensive proprietary frameworks and templates, such as Tableau. SYNAPTIK allows for blending of internal and external data in order to produce valuable new insights. There’s no data modeling required to drop in 3rd-party data sources, so it is even possible to create reporting and insight agents across data pools.

– With traditional methods, data normalization requires hours and hours of time – indeed, the bulk of analysts’ time is spent on it today, leaving very little time for deep analysis and none for deep insight. With SYNAPTIK, data management that once took 400 hours a month can take minutes, freeing nearly all of that time for the analysis, discovery, and innovation needed to deliver results.

– Not only is it possible to create your own custom reports and analytic agents, SYNAPTIK enables organizations to share their reporting agents for others to use and modify.

– The inherent flexibility of SYNAPTIK enables businesses to continually provide new data products/services to their customers.

– Not so far down the line: Establishment of The Synaptik Marketplace, where you can participate, monetize, and generate additional revenue by allowing others to subscribe to your Agents and/or Data.

All of these returns contribute not only to augmenting organizational leadership and innovation throughout its hierarchy, but also to producing incredibly valuable intelligence monetization, breakthrough revenue, and improved “client stickiness” with the rollout of new data products and services. And, best of all, it puts businesses into a flexible data environment that does, quite elegantly, enable continuous transformation as industries, markets, and data landscapes continue to change.

We’ve got the Experience you Need

True Interaction has logged hundreds of thousands of hours of research, design, development, and deployment of consumer products and enterprise solutions. Our services directly impact a variety of industries and departments through our deep experience in producing critical business platforms.

We Can Integrate Anything with… Anything
To meet the endlessly variable demands of our global clients, TI has seen it all and has had to do it all. From legacy systems to open source, we can determine the most optimal means to achieve operational perfection, devising and implementing the right tech stack to fit your business. We routinely pull together disparate data sources, fuse together disconnected silos, and do exactly what it takes for you to operate with tight tolerances, making your business engine hum. Have 100+ platforms? No problem. Give us a call.

by Michael Davison

AI and the Classroom: Machine Learning in Education


For years, schooling has been typified by the physical grind it imposes on both students and their teachers: teachers cull and prepare educational materials, manually grade students’ homework, and provide feedback to students (and their parents) on learning progress. They may be burdened with an unmanageable number of students, or a wide gulf of varying student learning levels and capabilities in one classroom. Students, on the other hand, have generally been pushed through a “one-size-fits-all” gauntlet of learning, not personalized to their abilities, needs, or learning context. I’m always reminded of this quote from world-renowned education and creativity expert Sir Ken Robinson:

“Why is there this assumption that we should educate children simply according to how old they are? It’s almost as if the most important thing that children have in common is their date of manufacture.”

But as the contemporary classroom has become more and more digitized, we’ve seen recent advances in AI and machine learning that are closing in on being able to finally address historical “hand-wrought” challenges – by not only collecting and analyzing data that students generate (such as e-learning log files) when they interact with digital learning systems, but by pulling in large swaths of data from other areas including demographic data of students, educator demographic and performance data, admissions and registration info, human resources information, and so forth.

Quick Review: What is Machine Learning?

Machine learning is a method of data analysis that automates analytical model building. Using algorithms that iteratively learn from data, machine learning allows computers to find hidden insights without being explicitly programmed where to look. Machine learning works especially well for prediction and estimation when the following are true:

-The inputs are well understood. (You have a pretty good idea of which inputs are important, but not how to combine them.)
-The output is well understood. (You know what you are trying to model.)
-Experience is available. (You have plenty of examples to train the model.)

The crucible of machine learning consists of capturing and maintaining a rich set of data, and bringing about the serendipitous state of knowledge discovery: the process of parsing through the deluge of Big Data, identifying meaningful patterns within it, and transforming it into a structured knowledge base for future use. As long as the data flows, its application is endless, and we already see it everywhere, from Facebook algorithms to self-driving cars. Today, let’s examine machine learning and its implementation in the field of Education.

Application of Machine Learning in Education


A few years ago, Sotiris Kotsiantis, a mathematics professor at the University of Patras, Greece, presented a novel case study describing the emerging field of educational data mining. In it, he explored using students’ key demographic characteristics, together with their grades on a small number of written assignments, as the data set for a machine learning regression method that predicts a student’s future performance.
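The study’s actual data and regression methods aren’t reproduced here, but the underlying idea can be sketched with a simple nearest-neighbor regression over invented assignment grades:

```python
# Hypothetical sketch of the idea behind such a study: predict a student's
# final grade from early assignment grades using nearest-neighbor regression.
# The data and the choice of k-NN are illustrative assumptions.

def knn_predict(train, query, k=3):
    """train: list of (features, target) pairs; query: a feature tuple."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    nearest = sorted(train, key=lambda row: dist(row[0], query))[:k]
    return sum(target for _, target in nearest) / k

# Past students: (assignment 1 grade, assignment 2 grade) -> final grade
history = [
    ((55, 60), 58), ((62, 65), 64), ((70, 72), 71),
    ((80, 78), 79), ((90, 88), 89), ((95, 93), 94),
]

# Predict the final grade of a new student scoring 75 and 76.
estimate = knn_predict(history, (75, 76), k=3)
```

The prediction is just the average final grade of the most similar past students, which captures the “experience available” condition in miniature.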

In a similar vein, GovHack, Australia’s largest open government and open data hackathon, has included several projects in the education space, among them a project that aims to develop a prediction model educators, schools, and policy makers can use to estimate a student’s risk of dropping out of school.
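As a toy-scale illustration of such a dropout-risk model (the features, data, and choice of logistic regression are all assumptions here, not GovHack’s actual approach), a classifier can be trained to map simple indicators to a risk probability:

```python
import math

# Toy dropout-risk classifier: logistic regression trained by stochastic
# gradient descent on invented features (absence rate, failed-course count).
# Feature names, data, and hyperparameters are illustrative assumptions.

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_logreg(rows, labels, lr=0.1, epochs=2000):
    w = [0.0] * len(rows[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(rows, labels):
            p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
            err = p - y  # gradient of the log-loss for this sample
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b

# (absence rate, failed courses) -> 1 = dropped out, 0 = graduated
X = [(0.05, 0), (0.10, 1), (0.40, 2), (0.55, 3), (0.08, 0), (0.60, 4)]
y = [0, 0, 1, 1, 0, 1]

w, b = train_logreg(X, y)
# Risk estimate for a new student with 50% absences and 3 failed courses.
risk = sigmoid(sum(wi * xi for wi, xi in zip(w, (0.50, 3))) + b)
```

A real deployment would use far richer features (the demographic, registration, and HR data mentioned earlier) and a validated model, but the output shape is the same: a per-student probability that can trigger early intervention.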

Springboarding from these two examples, IBM’s Chalapathy Neti recently shared IBM’s vision of Smart Classrooms: cloud-based learning systems that can help teachers identify students most at risk of dropping out, understand why they are struggling, and gain insight into the interventions needed to overcome their learning challenges:

The system could also couple a student’s goals and interests with data on their learning styles so that teachers can determine what type of content to give the student, and the best way to present it. Imagine an eighth grader who dreams of working in finance but struggles with quadratic and linear equations. The teacher would use this cognitive system to find out the student’s learning style and develop a plan that addresses their knowledge gaps.
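IBM’s actual system is of course far more sophisticated; as a purely hypothetical sketch of the matching idea, a recommender might simply pick the lesson that targets the student’s weakest skill, in their preferred format. The catalog, skill names, and scores below are all invented:

```python
# Hypothetical content-matching sketch: choose the lesson for the student's
# weakest skill, presented in their preferred format. All names and scores
# are invented for illustration.

catalog = [
    {"skill": "linear equations",    "format": "video", "title": "Lines 101"},
    {"skill": "linear equations",    "format": "text",  "title": "Lines in Depth"},
    {"skill": "quadratic equations", "format": "video", "title": "Parabolas 101"},
    {"skill": "quadratic equations", "format": "text",  "title": "Quadratics in Depth"},
]

def recommend(skill_scores, preferred_format):
    """Return the title of a lesson targeting the lowest-scoring skill."""
    weakest = min(skill_scores, key=skill_scores.get)
    for lesson in catalog:
        if lesson["skill"] == weakest and lesson["format"] == preferred_format:
            return lesson["title"]
    return None

# The eighth grader above: strong on linear, weak on quadratic equations,
# and known (from learning-style data) to prefer video.
student = {"linear equations": 0.8, "quadratic equations": 0.4}
pick = recommend(student, "video")
```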

Process efficiency: Scheduling, grading, organization

Elsewhere, several Machine Learning for Education ICML (international machine learning conference) workshops have explored novel machine learning applications designed to benefit the education community, such as:

-Learning analytics that build statistical models of student knowledge to provide computerized, personalized feedback on learning progress to students and their instructors
-Content analytics that organize and optimize content items like assessments, textbook sections, lecture videos, etc.
-Scheduling algorithms that search for an optimal and adapted teaching policy that helps students learn more efficiently
-Grading systems that assess and score student responses to assessments and computer assignments at large scale, either automatically or via peer grading
-Cognitive psychology, where data mining is becoming a powerful tool to validate the theories developed in cognitive science and facilitate the development of new theories to improve the learning process and knowledge retention
-Active learning and experimental design, which adaptively select assessments and other learning resources for each student individually to enhance learning efficiency
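As one concrete, simplified instance of the scheduling idea above, a Leitner-box spaced-repetition scheduler promotes items the student answers correctly into longer review intervals and demotes missed items back to frequent review. The box numbers and intervals below are illustrative assumptions, not any particular system’s settings:

```python
# A minimal Leitner-box spaced-repetition scheduler, one simple instance
# of a scheduling algorithm that adapts to student performance.
# Box numbers and review intervals are illustrative assumptions.

REVIEW_INTERVAL_DAYS = {1: 1, 2: 3, 3: 7}  # box -> days until next review

def update_box(box, answered_correctly):
    """Promote on success (capped at the top box), demote to box 1 on failure."""
    if answered_correctly:
        return min(box + 1, max(REVIEW_INTERVAL_DAYS))
    return 1

# One flashcard's trajectory across three review sessions:
box = 1
box = update_box(box, True)   # correct -> promoted to box 2
box = update_box(box, True)   # correct -> promoted to box 3
box = update_box(box, False)  # wrong   -> back to box 1, reviewed daily again
next_review = REVIEW_INTERVAL_DAYS[box]
```

Richer schedulers replace the fixed boxes with a learned model of each student’s forgetting curve, but the adaptive principle is the same.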

Existing Platforms

Recently, digital education venture capitalist Tom Vander Ark shared 8 different areas where leading-edge platforms are already leveraging machine learning in education:

1. Content analytics that organize and optimize content modules:
a. Gooru, IBM Watson Content Analytics

2. Learning analytics that track student knowledge and recommend next steps:
a. Adaptive learning systems: DreamBox, ALEKS, Reasoning Mind, Knewton
b. Game-based learning: ST Math, Mangahigh

3. Dynamic scheduling that matches students who need help with teachers who have time:
a. NewClassrooms uses learning analytics to schedule personalized math learning experiences.

4. Grading systems that assess and score student responses to assessments and computer assignments at large scale, either automatically or via peer grading:
a. Pearson’s WriteToLearn and Turnitin’s Lightside can score essays and detect plagiarism.

5. Process intelligence tools that analyze large amounts of structured and unstructured data, visualize workflows, and identify new opportunities:
a. BrightBytes Clarity reviews research and best practices, creates evidence-based frameworks, and provides a strength gap analysis.
b. Enterprise Resource Planning (ERP) systems like Jenzabar and IBM SPSS help HigherEd institutions predict enrollment, improve financial aid, boost retention, and enhance campus security.

6. Matching teachers and schools:
a. MyEdMatch and TeacherMatch are eHarmony for schools.

7. Predictive analytics and data mining that learn from expertise to:
a. Map patterns of expert teachers
b. Improve learning, retention, and application.

8. Lots of back office stuff:
a. EDULOG does school bus scheduling
b. Evolution, DietMaster.
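As a toy illustration of the plagiarism-detection idea in the grading entry above (this is not how WriteToLearn or Lightside actually work), overlapping word 3-grams give a crude similarity score between two submissions:

```python
# Toy plagiarism check: Jaccard similarity over word 3-grams ("shingles").
# The texts and any threshold you would apply are illustrative assumptions.

def shingles(text, n=3):
    """Return the set of n-word sequences in the text."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard(a, b):
    """Fraction of shared shingles: 0.0 = no overlap, 1.0 = identical."""
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

original = "machine learning allows computers to find hidden insights in data"
copied = "machine learning allows computers to find hidden insights quickly"
fresh = "the weather today is unusually warm for this time of year"

sim_copied = jaccard(original, copied)  # high: mostly shared phrasing
sim_fresh = jaccard(original, fresh)    # zero: no shared phrasing
```

Production systems add stemming, fingerprinting, and web-scale indexes, but shingle overlap is the core signal.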


As the modern classroom becomes more and more digitized, we are able to gather myriad sets of data. The trick, of course, is being able to put that data to use. The prize at the heart of machine learning is knowledge discovery: the process of parsing through the deluge of Big Data, identifying meaningful patterns within it, and transforming it into a structured knowledge base for future use. In this article, we’ve seen examples of machine learning in the education sector for prediction, scheduling, grading, and organization. We’ve also listed existing education-related platforms that use a machine learning component.

What does it mean to me?

Big Data has swept into every industry and business function and is now an important factor in production, alongside labor and capital. In a decision-making system, the more relevant data available, the higher the likelihood of making good decisions. The time is now for organizations, in education or otherwise, to research how a cost-efficient machine learning component can transform their operational output. For more information, check out this detailed guide by Jesse Miller on the benefits of technology in the classroom and suggestions on ways to incorporate it.

“Parents are continually exposed to new technology via their children. Whether it be iPad App usage tricks, to the advent of robotics competitions, and perhaps now “new ways of thinking” as a result of interaction with Machine Learning based educational environments. Siloed educational content may give way to a topology of learning experiences.” O. Liam Wright – CEO, True Interaction

True Interaction produces custom, full-stack, end-to-end technology solutions across web, desktop, and mobile, integrating multiple data sources into a customized data solution. True Interaction can determine the optimal means to achieve operational excellence, devising and implementing the right tech stack to fit a specific school’s or district’s needs. True Interaction pulls together disparate data sources, fuses disconnected silos, and does what it takes for school data systems to operate with high levels of efficiency and efficacy, ultimately leading to improved student achievement outcomes.