Those are the findings of a study from Hewlett-Packard, whose Fortify on Demand security division tested 10 popular smartwatches. The company is in the process of alerting vendors about the flaws and can't disclose the watches it tested, said Daniel Miessler, practice principal at HP.
HP also examined the security around the Web interfaces and mobile apps that accompany smartwatches and allow a person to access the device as well as how data gathered by watch apps is protected and used.
The study found vulnerabilities with each of the watches and raised concerns over user authentication methods, data encryption and data privacy, among other issues.
Only half of the watches HP tested let users lock the device's screen, potentially allowing a stranger to access their sensitive information if the wearable was lost or stolen. Smartwatch sensors collect health data, including heart rate, and the devices store personal details like a person's name, address and birth date. Two of the watches that lack the ability to lock a screen could be paired with a smartphone other than the owner's, giving an attacker access to the wearable's data.
The smartwatches fell short when it came to encrypting data that's sent to the cloud. While the wearables used SSL and TLS security protocols to encrypt information, some relied on SSL 2.0, an older version of the protocol that's known to have security flaws. Additionally, 40 percent of the watches were vulnerable to POODLE attacks.
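The downgrade problem the study describes is avoidable on the client side. The sketch below, a minimal illustration rather than anything from the HP report, shows how a companion app written in Python could build a TLS context that refuses SSL 2.0, SSL 3.0 and early TLS versions outright, so a handshake with a POODLE-vulnerable server simply fails:

```python
import ssl

# Hedged sketch: a client-side TLS context that can't be negotiated down to a
# legacy, POODLE-vulnerable protocol. The cloud endpoint it would connect to
# is app-specific and not part of this example.
def make_strict_context() -> ssl.SSLContext:
    ctx = ssl.create_default_context()            # sane defaults, cert checks on
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # rejects SSLv2/SSLv3/TLS 1.0/1.1
    return ctx
```

Used with `socket.create_connection(...)` and `ctx.wrap_socket(...)`, this context aborts the connection if the server only speaks an older protocol, instead of silently falling back.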
In some cases, vendors prioritize getting smartwatches on the market over security so measures like data encryption are overlooked. Others don't realize the dangers of transmitting data in clear text form, Miessler said.
HP questioned if there's enough transparency around how data collected by watch apps is used, saying people and app developers may not realize that information ends up on a "substantial" number of servers. This provides attackers more access points to the data, either by intercepting it in transit or going after the servers where it's stored, Miessler said. Some of the places where smartwatch data ended up included advertising and analytics networks, he added.
HP took issue with how seven of the 10 watches processed firmware updates. Those devices were sent unencrypted updates, and while the updates were signed to prevent malicious files from being installed, signing alone didn't stop the files from being intercepted and read by others.
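The distinction the report draws can be shown in a few lines. In this sketch (a toy illustration, not any vendor's actual update scheme; the shared-secret key and payload are invented), an HMAC signature lets the device reject a tampered update, but the payload itself still travels in the clear:

```python
import hashlib
import hmac

SIGNING_KEY = b"vendor-secret"  # assumption: shared-secret HMAC, for illustration only

def sign_update(firmware: bytes) -> bytes:
    """Produce a signature the watch can check before installing."""
    return hmac.new(SIGNING_KEY, firmware, hashlib.sha256).digest()

def verify_update(firmware: bytes, signature: bytes) -> bool:
    """True only if the payload matches the signature (tampering is caught)."""
    expected = hmac.new(SIGNING_KEY, firmware, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

firmware = b"v2.1: fixes pairing bug"
sig = sign_update(firmware)
assert verify_update(firmware, sig)         # authentic update is accepted
assert not verify_update(b"tampered", sig)  # modified payload is rejected
# Note: 'firmware' was never encrypted -- anyone on the network can read it.
```

Signing answers "was this modified?" but not "can this be read?"; only encryption addresses the second question, which is the gap HP flagged.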
Consumers may not realize they need to be aware of security issues around mobile apps and Web interfaces used to access smartwatches, Miessler said.
"It's not just a smartwatch. It's the ecosystem around it," he said.
Of the devices HP tested, three had Web interfaces and mobile apps that could be used to access the smartwatch. HP said these systems didn't require complex passwords. Additionally, the interfaces and apps didn't lock people out after multiple wrong password attempts and lacked two-factor authentication. When paired with account-harvesting tactics, which cull the Web for information on people, these weaknesses could allow an attacker to use brute force attacks to figure out a person's password, HP said.
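The missing lockout is the cheapest of those defenses to add. The sketch below is a minimal illustration of the idea, not code from any of the tested products; the threshold and in-memory counter are assumptions, and a real service would persist state and add delays or CAPTCHAs:

```python
# Hedged sketch of an account-lockout policy: after MAX_ATTEMPTS wrong
# passwords, further attempts are refused, blunting brute-force guessing.
MAX_ATTEMPTS = 5  # assumed threshold for illustration

class LoginGuard:
    def __init__(self) -> None:
        self.failures: dict[str, int] = {}  # per-user failed-attempt counter

    def attempt(self, user: str, password: str, real_password: str) -> str:
        if self.failures.get(user, 0) >= MAX_ATTEMPTS:
            return "locked"                 # refuse even a correct guess
        if password == real_password:
            self.failures[user] = 0         # reset on success
            return "ok"
        self.failures[user] = self.failures.get(user, 0) + 1
        return "denied"
```

Without a guard like this, an attacker can try password candidates at full speed against the Web interface, which is exactly the brute-force scenario HP describes.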
As smartwatch adoption grows, HP predicted the devices will become appealing targets for hackers since people will store sensitive information on them such as data for making purchases or even unlocking their homes' doors.
But those security concerns, as well as how smartwatch app data is used, aren't on people's minds when they purchase a wearable, Miessler said. Instead, buyers focus on the features a smartwatch offers, like fitness and wellness monitoring.
"They're thinking what kind of health data can you get from the wearable, not where is the health data going," he said.
Fred O'Connor writes about IT careers and health IT for The IDG News Service. Follow Fred on Twitter at @fredjoconnor. Fred's e-mail address is fred_o'connor@idg.com