From Bones to Connected Machines: The Curious Evolution of Big Data
WHY YOU SHOULD CARE
Because understanding our future potential starts with recognizing our past.
OZY and Predix from GE — the cloud-based development platform built for industry — have partnered to bring you an inside look at the future of digital industries, where people, data and productivity meet.
The use of data to make faster, smarter decisions has taken a long and winding road through human history (see our special slideshow above). Back in 18,000 B.C., it sounds like a scene straight out of The Flintstones: using bones to forecast how long food supplies would last. But that’s how Paleolithic tribespeople monitored trading activities in what is now Uganda and the Democratic Republic of the Congo. Made from baboon fibulae, these “tally sticks” are the oldest known evidence that humans practiced arithmetic. Simple math, perhaps, but these seemingly primitive computations made use of small data to determine something pretty important: the tribe’s future. Think of it as big data’s little bro.
And then came the first “machines.” In the second century B.C., the primitive Antikythera mechanism of cogs and gears — known as the world’s oldest analog computer — served not only as a calendar and a predictor of solar and lunar eclipses but also came with a bonus feature: a dial for anticipating the cycle of the Olympic games. Using data to improve daily life was big even earlier: Around 600 B.C., our ancestors were busy analyzing information about crop yield and fallow farming to tweak strategies, efforts that boosted outputs and fed more people. In the 15th century, at the start of the Age of Exploration, navigators looked to the skies to find their way around the world, opening the high seas to global trade. And by the 16th century, the combination of empirical observations gathered at sea and the mapping of winds and currents helped pave the way for Portuguese explorers to chart a network of ocean routes covering the Atlantic, the Indian and the western Pacific oceans.
In 1926, inventor and futurist Nikola Tesla predicted that one day, “the whole Earth will be converted into a huge brain … and the instruments that we shall [use] will be amazingly simple compared to our present telephone.” And, oh, yeah, he added: “A man will be able to carry one in his vest pocket.”
Fast-forward to the Industrial Revolution, when the age of machines replaced our humble human hands — when steam power, more precise tools and the production of interchangeable parts revolutionized manufacturing. Assembly lines and electricity then propelled businesses into mass production in the 20th century. And this is when we really started paying attention to making machines and data work together to improve lives and businesses. Take 1965, when the U.S. government drew up a secret plan for the world’s first data center, placing about 742 million tax returns and 175 million sets of fingerprints on magnetic tape. At the same time, health-care providers started using electronic health records, allowing doctors to view patient info anywhere, at any time. These were some of the first real forays into what would become cloud computing.
The ’70s ushered in the Digital Revolution, when computers and automation dramatically increased efficiencies and improved bottom lines everywhere, and so began an explosion of data that has continued to grow exponentially. Imagine: In 1993, only 3 percent of the world’s data was stored digitally, on clunky hard drives and optical disks. Fifteen years later, it was a whopping 94 percent. By 2009, nearly all U.S. companies with 1,000 or more employees were storing an average of at least 200 terabytes of data. And by 2010, there was enough data to fill a stack of DVDs stretching from the Earth to the moon and back — and by 2020, experts predict, that stack could reach Mars.
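For the curious, the moon-and-back figure is easy to sanity-check with back-of-the-envelope arithmetic. Here is a minimal sketch; the per-disc numbers (4.7 GB capacity and 1.2 mm thickness for a standard single-layer DVD) and the average Earth–moon distance of 384,400 km are assumptions of ours, since the article doesn’t state the underlying data volume:

```python
# Back-of-the-envelope check of the "stack of DVDs to the moon and back" claim.
# Assumed figures (not stated in the article): a standard single-layer DVD
# holds 4.7 GB and is about 1.2 mm thick; the average Earth-moon distance
# is roughly 384,400 km.

DVD_CAPACITY_BYTES = 4.7e9       # 4.7 GB per disc
DVD_THICKNESS_M = 1.2e-3         # 1.2 mm per disc
EARTH_MOON_M = 384_400e3         # average Earth-moon distance, in meters

round_trip_m = 2 * EARTH_MOON_M
discs_in_stack = round_trip_m / DVD_THICKNESS_M
implied_bytes = discs_in_stack * DVD_CAPACITY_BYTES
implied_zettabytes = implied_bytes / 1e21

print(f"Discs in the stack: {discs_in_stack:.2e}")
print(f"Implied data volume: {implied_zettabytes:.1f} ZB")
```

Under these assumptions, a moon-and-back stack works out to roughly 640 billion discs holding about 3 zettabytes — the same order of magnitude as published estimates of the world’s data around 2010.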
Now we find ourselves landing in a new digital industrial age, where the Internet of Things, the cloud and cyberphysical systems are catapulting data analysis to unprecedented levels — and having a huge impact on business. In 2012, companies using data-driven decision making reported a 5 to 6 percent jump in productivity. And economists estimate that if harnessing data makes industries just 1 percent more efficient, we could add nearly $15 trillion to global GDP by 2030 — the equivalent of adding another U.S. economy. Another example: Sensors on engines, flaps and landing gear are improving aircraft performance, resulting in reduced turbulence and improved safety. And that’s just scratching the surface of today’s technological innovation.
Where will big data and connected machines take us next? Implantable technology is just one area, and it’s one that’s becoming an increasingly important part of a broader digital health-care revolution that, hopefully, will help us live longer (or at least not have to schedule as many doctor appointments). Predictive modeling will forecast disease outbreaks and determine the best places to build roads. Data innovation and digital industry will change the way we fly, eat, build new products and uncover innovative new ways to streamline business operations to churn out more in less time.
Predix from GE is enabling the adoption of powerful, secure and scalable solutions built for the industrial app economy. It’s industrial strength, powering the future of industry. Get Connected.
- Neil Parmar