9 ways technology will change within the next 10 years

24.03.2016
Ten years ago, there were no smartphones. It was the coffeeshop era of Wi-Fi, which meant that the Internet was just beginning to follow us out the door and into the world. Amazon first released EC2, to some confusion.

Nowadays, of course, Wi-Fi and mobile data are almost ubiquitous, smartphones have hit market saturation in the most developed nations, and EC2 is a cornerstone of modern business IT. The pace of technological progress continues to accelerate, it seems, as entire new product categories change the way we live and do business, and there’s no end in sight.

Here’s our look ahead at the next 10 years, and how the tech world may change.

Sure, they’ve moved to the cloud and gotten a bit smarter over the years, but the productivity apps we use every day have remained functionally the same since their advent – a word processor is still a word processor, regardless of whether it’s WordStar or Google Docs, and a spreadsheet is still a spreadsheet, be it Lotus or Excel 2013.

However, no less than the inventor of the spreadsheet himself, Dan Bricklin, told Network World that that’s going to change within the next 10 years.

Endpoint form factors are going to be the biggest driver of changes to productivity apps, Bricklin says. What we think of as productivity apps – spreadsheets, word processors and so on – are best used with a reasonably large screen and a keyboard.

But in a world where, increasingly, mobile devices are the way people enter the digital realm, traditional productivity apps don’t work as well.

“The phone, and perhaps the watch and other wearables, those are a different beast – and the tablet is sort of in between,” he says. “So then the question becomes – what would be a productivity tool for somebody in that situation?”

Navigating a database while waiting in line at the grocery store, for example, isn’t the way most people use their smartphones, so it’s unlikely to catch on, notes Bricklin, who is currently CTO at Alpha Software.

It may be, in fact, that productivity apps become much more diverse and specialized – rather than directly editing a spreadsheet on a smartphone, for example, a user could simply speak into the device to add data to a system while on the move. Databases of repair information could help auto mechanics and plumbers.

While the state of affairs has improved significantly over the past several years, it’s still an inarguable fact that the tech sector has a diversity problem. Big companies have proclaimed themselves distressed at that fact, and vowed to do something about it, but change has been relatively slow in coming.

The next decade, however, should see substantially more progress being made, thanks to a growing awareness of the issue’s importance and initiatives aimed at making the ranks of university computer science and engineering programs more diverse.

The current numbers show that the tech industry is still noticeably out of step with the rest of the country. Women are strongly underrepresented: just 25% of Intel’s workforce is female, along with 30% of Google’s, 31% of Apple’s and 28% of Microsoft’s. (The biggest employer of women in a listing compiled by informationisbeautiful is Pandora at 49%.)

Black and Latino workers are also startlingly absent from most of the top technology companies. The major tech firm with the highest proportion of black workers is Amazon at 15%; most others are in the low single digits. Latinos are best represented at HP, where they make up 14% of the workforce, while most other companies are again in the single digits.

But there are hopeful signs. Intel recently went public with its diversity figures, which were generally poor, but has vowed to accurately reflect the makeup of the U.S. by 2020, and companies may be beginning to realize the value of diversity along both gender and ethnic lines.

To understand where Cisco might be in 10 years, it helps to look back 10 years.

At this time in 2006, Cisco was in the third quarter of its 2006 fiscal year. The Catalyst 6500 switch had achieved $20 billion in revenue over its seven-year lifetime; the first-generation CRS-1 service provider router – dubbed “HFR,” for Huge Effin’ Router – was two years old and had 60 big customers worldwide; and Cisco IP phones on business desktops numbered 2 million.


Now, the Catalyst campus base is being replaced by the Catalyst 6800 while data centers are transitioning to the Nexus 9000. The CRS-1 has seen three more generations, capped off and succeeded by the Network Convergence System (NCS) series. And Cisco has shipped well in excess of 50 million IP phones, and now emphasizes cloud-based software for unified communications delivered as a service.

Expect the emphasis on software to continue and ultimately transform Cisco from the leading provider of networking hardware into a software company. Expect its “products” to be delivered as services priced on a perpetual or subscription-based license, hosted, operated and managed from a Cisco cloud. There will still be routers and switches, but in campus networks they will become increasingly thin, like the network interface device connecting your home to the PSTN; the horsepower of hardware processing and acceleration will be reserved for data centers and clouds.

Expect today’s Nexus 9000 and NCS to be succeeded by one or two generations within 10 years, just as the Cat 6500 and CRS-1 were a decade earlier. Expect Cisco to gain more share in data center servers and to conceive of even more ways to hyper-converge compute, networking and storage with applications, services and microservices built from tiny, semi-autonomous containers. Expect Cisco to reinvigorate on-premises computing by making it more cost-effective and secure than off-premises clouds.

Expect Cisco to be the leading provider of connectivity for the Internet of Things.

And expect Cisco to perhaps actually be the No. 1 IT company in the world, a title it currently covets. And expect many of its competitors and combatants to be lying in its wake.

(By Managing Editor Jim Duffy)

“I’m not even sure we’ll still call it ‘cloud’ in 2026; it’s just the way we do IT,” says Cloud Technology Partners’ Senior Vice President and industry pundit David Linthicum.

Over the next decade, more and more hyperscale data centers will be built to hold our exponentially increasing production of data and to feed our insatiable desire for the computing capacity to manage and analyze it. By 2026 we’ll reach a point where our smartphones are ultra-thin client devices with access to this virtually unlimited cloud-based compute and storage capacity.

Whereas today organizations are creating their new applications in the cloud, in 10 years the cloud will be the dominant and natural place to host applications. Today’s concerns about the security of the cloud will be reversed in a decade: The cloud will be considered a safer place to store data than attempting to host it yourself. Companies will store the bulk of their data – more than 100 petabytes will be the norm for most businesses – in cloud databases. They’ll have a choice of general-purpose vanilla cloud services or ones tailored to their specific vertical industries (a retail cloud, a health care cloud, a finance cloud, for example).

(By Senior Editor Brandon Butler)

Despite the cloud being the natural and dominant place for most applications and data, there will be very little focus on the underlying infrastructure, because that will all be managed by vendors. Instead, end users’ focus will be on ‘smart’ applications and services that take advantage of this ubiquitous infrastructure, predicts industry analyst and strategist Krishnan Subramanian, who hosted an event titled “Cloud 2020” exploring how this technology will change in the future.

Whereas today the basic unit of compute is a virtual or physical server, in 2026 it will be the massive number of connected devices that will be producing data that is stored in the cloud. Cloud systems will have powerful machine learning and artificial intelligence engines that ingest feeds of data being produced by these IoT devices and produce business logic that drives operational decisions.

(By Senior Editor Brandon Butler)

One big risk of data stored in the cloud is that if you need to use it, you have to decrypt it, opening it up to possible attack. But homomorphic encryption should fix that.

This technique encrypts data in such a way that applications can access it and make calculations based on it without actually using the data itself, just the encrypted representation of the data. And the calculations made by the application, once completed, can be encrypted as well.

So, for example, a patient record could be stored, and an application that predicts the patient’s outcomes could be applied to the data. But since the data is never decrypted, it is never at risk. Such a scheme would help medical providers meet the requirements of medical-confidentiality laws and regulations.
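The core idea – computing on data that stays encrypted – can be sketched with the Paillier cryptosystem, a well-known additively homomorphic scheme. (This is an illustrative stand-in: the Microsoft Research work cited below uses different, more capable lattice-based schemes, and the tiny primes here are for readability only; real keys are thousands of bits.)

```python
# Toy additively homomorphic encryption (Paillier scheme). The "server"
# adds two encrypted values without ever seeing the plaintexts.
from math import gcd

p, q = 293, 433                 # tiny demo primes, NOT secure
n = p * q
n2 = n * n
g = n + 1                       # standard Paillier generator choice
lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)   # lcm(p-1, q-1)

def L(x):                       # Paillier's L function
    return (x - 1) // n

mu = pow(L(pow(g, lam, n2)), -1, n)   # precomputed decryption constant

def encrypt(m, r):
    # c = g^m * r^n mod n^2, where r is per-message randomness
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return (L(pow(c, lam, n2)) * mu) % n

c1 = encrypt(40, 17)
c2 = encrypt(2, 23)
c_sum = (c1 * c2) % n2          # multiplying ciphertexts adds plaintexts
print(decrypt(c_sum))           # -> 42
```

The party doing the arithmetic never holds a decryption key; only the data owner can recover the result, which is exactly the property the patient-record example relies on.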

This example is part of a Microsoft Research paper this year about achieving high throughput, accurate and private manipulation of encrypted data.

(By Senior Editor Tim Greene)

Much has been made of the flourishing of free and open-source software in recent years – collaborative, fluid teams working on open code bases are responsible for a large and growing proportion of the software used in the enterprise world.

And there’s no reason to think that the trend will change anytime soon. Neela Jacques, the executive director of the OpenDaylight SDN project, says that an over-reliance on closed, proprietary systems has hurt business users in the past.


“Companies have realized that it’s inefficient to try to build proprietary platforms that have a low chance of success,” he says. “What we’re seeing is an emerging model where organizations spend a moderate amount of resources with others to establish a collaborative, standard platform.”

Moreover, according to Jacques, the open-source community will become increasingly professionalized over the next decade – removing a barrier to entry for conservative companies worried about the fiery wars of words that periodically roil some open-source projects.

“The conversation will move to how we build and maintain the greatest shared technologies of our time, from licensing to certification to training talent to support these resources,” he says.

The Internet of Things is a phenomenon that we’re constantly told is on the verge of taking off and really changing the world around us, but it never quite seems to happen. Within the next 10 years, though, the technology will start to realize its potential, according to IDC vice president of network infrastructure Rohit Mehra.

Most of what’s held IoT back can be sorted into two categories – security issues and interoperability problems. The relative novelty of the technology means that there are few generally accepted communications standards, which in turn means that it can be difficult to have multiple IoT devices working together unless they’re all from the same manufacturer. That limits their utility, since the entire point is to have everything working as a seamless whole. Additionally, IoT devices represent potential ways into a network for attackers, and their security isn’t always assured.

But Mehra says that these problems will be solved – for a given degree of “solved” – within the coming decade.

“I think all the pieces are slowly falling into place,” he told Network World. “Today’s network can adapt, it can really scale, and all on the fly as application needs change – and what that means is, now, if I’m an IT guy, I can really think of leveraging the cloud, leveraging my big data and analytics applications, to do what’s best for all my IoT endpoints.”

Big data – if you haven’t heard a lot about it over the past several years, you haven’t been paying much attention. The term has been shoved as a buzzword into the conversation about every highly scalable technology product, and its definition has ballooned so much that it’s beginning to lose meaning.

The impact of the increasingly quantified tech world, however, has been meaningful indeed, and it will continue to have a notable impact on the network into the foreseeable future. Judith Hurwitz, who runs the consulting firm Hurwitz and Associates, says that big data’s role in analytics will be substantial.


“One of the most important trends that is just beginning to impact the network world is machine learning and cognitive computing,” she writes via email. “The ability to analyze massive amounts of data to look for patterns and anomalies [is] changing the way new tools are able to anticipate problems before they can cause outages or intrusions.”

Watson-esque tools that perform complicated analysis require commensurately huge data sources, so in a very real way, big data is powering what could be a renaissance in network analysis and management.
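The pattern-and-anomaly hunting Hurwitz describes can be illustrated with a deliberately simple sketch: flag a network measurement that deviates sharply from its recent history. (The metric name, window size and threshold below are illustrative assumptions, not any vendor’s product; production tools use far more sophisticated models.)

```python
# Minimal sliding-window anomaly detection over a stream of metrics.
from statistics import mean, stdev

def find_anomalies(samples, window=10, threshold=3.0):
    """Return indices whose value lies more than `threshold` standard
    deviations from the mean of the preceding `window` samples."""
    flagged = []
    for i in range(window, len(samples)):
        history = samples[i - window:i]
        mu, sigma = mean(history), stdev(history)
        if sigma > 0 and abs(samples[i] - mu) / sigma > threshold:
            flagged.append(i)
    return flagged

# Steady ~20 ms latency readings with one spike injected at index 12
latency_ms = [20, 21, 19, 20, 22, 20, 19, 21, 20, 20, 21, 19, 95, 20, 21]
print(find_anomalies(latency_ms))   # -> [12]
```

The point of the example is the shape of the problem, not the statistics: the more telemetry a tool can ingest, the better its baseline, which is why big data and network analytics feed each other.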

Nor is that influence limited to the network by itself. Matt Roberts, marketing director of telecom management firm Amdocs, says that data from other parts of the infrastructure can be built into an analytics solution.

“We’re moving to these new environments [where] … a lot of the decision-making or human intervention can be done through the use of intelligent analytics,” he says.

(www.networkworld.com)

Jon Gold, Network World staff