remixtures

joined 2 years ago
[–] [email protected] 1 point 13 hours ago

"The utility of the activity data in risk mitigation and behavioural modification is questionable. For example, an actuary we interviewed, who has worked on risk pricing for behavioural Insurtech products, referred to programs built around fitness wearables for life/health insurance, such as Vitality, as ‘gimmicks’, or primarily branding tactics, without real-world proven applications in behavioural risk modification. The metrics some of the science is based on, such as the BMI or 10,000 steps requirement, despite being so widely associated with healthy lifestyles, have ‘limited scientific basis.’ Big issues the industry is facing are also the inconsistency of use of the activity trackers by policyholders, and the unreliability of the data collected. Another actuary at a major insurance company told us there was really nothing to stop people from falsifying their data to maintain their status (and rewards) in programs like Vitality. Insurers know that somebody could just strap a FitBit to a dog and let it run loose to ensure the person reaches their activity levels per day requirement. The general scepticism (if not broad failure) of products and programs like Vitality to capture data useful for pricing premiums or handling claims—let alone actually induce behavioural change in meaningful, measurable ways—is widely acknowledged in the industry, but not publicly discussed."

https://www.sciencedirect.com/science/article/pii/S0267364924001614

 

"Barcelona has become the cyber offensive capital of Europe, and Haaretz has learned that at least three teams of Israeli researchers focused on facilitating advanced hacking capabilities have relocated to the Catalonian capital in the past year and a half, the latest of them in recent months.

"There are roughly six such groups of Israelis who are the elite in the field – and half of them have moved to Spain," says an industry executive.

In the past two months, a team of Israeli vulnerability researchers - an industry term for hackers specializing in identifying weaknesses in digital defenses, known as "exploits" - arrived in Barcelona from Singapore. The team specializes in finding breaches in smartphone defenses through which spyware can be remotely installed."

https://www.haaretz.com/israel-news/security-aviation/2024-12-26/ty-article/.premium/israeli-hackers-flock-to-barcelona-as-spyware-industry-shifts/00000193-fec4-df5b-a9b3-fec5d9dc0000

#Israel #CyberSecurity #Hacking #Spyware #Spain #Barcelona

 

"This article uses the case study of an insurance product linked to a health and wellbeing program—the Vitality scheme—as a lens to examine the limited regulation of collection and use of non-personal (de-identified/anonymised) information and the impacts it has on individuals, as well as society at large. Vitality is an incentive-based engagement program that mobilises online assessment tools, preventive health screening, and physical activity and wellness tracking through smart fitness technologies and apps. Vitality then uses the data generated through these activities, mainly in an aggregated, non-personal form, to make projections about changes in behaviour and future health outcomes, aiming at reducing risk in the context of health, life, and other insurance products. Non-personal data has been traditionally excluded from the scope of legal protections, and in particular privacy and data regimes, as it is thought not to contain information about specific, identifiable people, and thus its potential to affect individuals in any meaningful way has been understood to be minimal. However, digitalisation and ensuing ubiquitous data collection are proving these traditional assumptions wrong. We show how the response of the legal systems is limited in relation to non-personal information collection and use, and we argue that irrespective of the (possibly) beneficial nature of insurance innovation, the current lack of comprehensive regulation of non-personal data use potentially leads to individual, collective and societal data harms, as the example of the Vitality scheme illustrates."

https://www.sciencedirect.com/science/article/pii/S0267364924001614

#Australia #HealthInsurance #Anonymization #Privacy #DataProtection #GDPR #Insurance

 

"EFF supporters get that strong encryption is tied to one of our most basic rights: the right to have a private conversation. In the digital world, privacy is impossible without strong encryption.

That’s why we’ve always got an eye out for attacks on encryption. This year, we pushed back—successfully—against anti-encryption laws proposed in the U.S., the U.K. and the E.U. And we had a stark reminder of just how dangerous backdoor access to our communications can be."

https://www.eff.org/deeplinks/2024/12/defending-encryption-us-and-abroad

#Encryption #USA #UK #CyberSecurity #EU #Surveillance #Privacy #DigitalRights

 

"North Korea-linked hackers stole more from cryptocurrency platforms this year than ever before, according to Chainalysis Inc., showcasing rising capabilities that researchers say threaten US national security.

Digital thieves linked to North Korea utilize advanced methods such as manipulating remote work opportunities and are responsible for more than half of the total $2.2 billion stolen from platforms in 2024, the blockchain analytics company said in a report Thursday. North Korean-affiliated groups stole $1.34 billion in 47 incidents in 2024, up from $660.5 million across 20 incidents in 2023, according to the company’s findings."

https://www.bloomberg.com/news/articles/2024-12-19/north-korean-hackers-stole-record-1-3-billion-in-crypto-in-2024

#Crypto #CryptoCurrencies #NorthKorea #CyberCrime #CyberSecurity

 

"Capitalist and technology-enabled surveillance has moved beyond targeting users with ads to targeting their lives. This is why privacy online today means freedom tomorrow. Protecting our privacy secures our fundamental rights for the future.

I will be honest, it can be overwhelming; however, in times like this, I like to focus on what can be done instead of worrying about what hasn't happened yet. The most important thing is to act, no matter how difficult it can be during times of fear and stress. Pushing for incremental change and improvements requires small actions every day. We have to engage the folks that are willing to join our fight, pave the way for those actions, and build the communities we want collectively.

There is a lesson to be learned from merging with Tails in 2024 and our growth in the last several years: together we are stronger. And in 2025, I want to use this lesson as a guiding principle, that solidarity and collaboration are our greatest strengths."

https://blog.torproject.org/tor-in-2024/

#Tor #Anonymity #Privacy #Surveillance

 

"The European Commission is proposing regulation under its initiative for digitalising travel documents. We are responding to a consultation that is open to the public and highlighting problems with the proposal. In this article we provide background to the initiative and highlight problems that could put fundamental rights at risk, such as a new, secretive biometric surveillance infrastructure to implement the proposed system.

The European Commission (EC) has presented two proposals in the context of their initiative for digitalising travel documents. This initiative includes a “Proposal for a Regulation establishing an application for the electronic submission of travel data (“EU Digital Travel application”) […] as regards the use of digital travel credentials” (2024/0670 (COD)) – henceforth “the proposal” or “the travel app” and subject of this article and of our consultation feedback – and a “Proposal for a Council Regulation establishing an identity card-based digital travel credential”. A previous public consultation on the goals of this initiative, prior to the presentation of the legislative proposals, received overwhelmingly negative feedback."

https://edri.org/our-work/pre-travel-controls-digitalising-travel-documents/

#EU #EC #Surveillance #Biometrics #DataProtection #Privacy #DigitalRights

 

"Italy's data protection authority has fined ChatGPT maker OpenAI €15 million ($15.66 million) over how the generative artificial intelligence application handles personal data.

The fine comes nearly a year after the Garante found that ChatGPT processed users' information to train its service in violation of the European Union's General Data Protection Regulation (GDPR).

The authority said OpenAI did not notify it of a security breach that took place in March 2023, and that it processed the personal information of users to train ChatGPT without having an adequate legal basis to do so. It also accused the company of going against the principle of transparency and related information obligations toward users.

"Furthermore, OpenAI has not provided for mechanisms for age verification, which could lead to the risk of exposing children under 13 to inappropriate responses with respect to their degree of development and self-awareness," the Garante said.

In addition to the €15 million fine, the company has been ordered to carry out a six-month-long communication campaign on radio, television, newspapers, and the internet to promote public understanding of how ChatGPT works.

This specifically includes the nature of data collected, both user and non-user information, for the purpose of training its models, and the rights that users can exercise to object, rectify, or delete that data."

#AI #GenerativeAI #EU #Italy #OpenAI #DataProtection #Privacy #ChatGPT

 

"The law does not specify which social media platforms will be banned. Instead, this decision is left to Australia’s communications minister who will work alongside the country’s internet regulator, the eSafety Commissioner, to enforce the rules. This gives government officials dangerous power to target services they do not like, all at a cost to both minor and adult internet users.

The legislation also does not specify what type of age verification technology will be necessary to implement the restrictions but prohibits using only government IDs for this purpose. This is a flawed attempt to protect privacy.

Since platforms will have to provide means other than government ID to verify their users' ages, they will likely rely on unreliable tools like biometric scanners. The Australian government awarded the contract for testing age verification technology to a UK-based company, Age Check Certification Scheme (ACCS), which, according to the company website, “can test all kinds of age verification systems,” including “biometrics, database lookups, and artificial intelligence-based solutions.”"

https://www.eff.org/deeplinks/2024/12/australia-banning-kids-social-media-does-more-harm-good

#Australia #SocialMedia #AgeVerification #Surveillance #Privacy #DataProtection

 

"The findings, presented in November in Madrid at the Internet Measurement Conference (IMC 2024) and published in the Proceedings of the 2024 ACM on Internet Measurement Conference, highlight the frequency with which these screenshots are transmitted to the servers of the brands analyzed: Samsung and LG. Specifically, the research showed that Samsung TVs sent this information every minute, while LG devices did so every 15 seconds.

"This gives us an idea of the intensity of the monitoring and shows that smart TV platforms collect large volumes of data on users, regardless of how they consume content, whether through traditional TV viewing or devices connected via HDMI, like laptops or gaming consoles," Callejo emphasizes.

To test the ability of TVs to block ACR tracking, the research team experimented with various privacy settings on smart TVs. The results demonstrated that, while users can voluntarily block the transmission of this data to servers, the default setting is for TVs to perform ACR."

https://techxplore.com/news/2024-12-smart-tvs-viewing-external-screens.html

#TVs #SmartTVs #Surveillance #DataProtection #Privacy

 

"While I once hoped 2017 would be the year of privacy, 2024 closes on a troubling note, a likely decrease in privacy standards across the web. I was surprised by the recent Information Commissioner’s Office post, which criticized Google’s decision to introduce device fingerprinting for advertising purposes from February 2025. According to the ICO, this change risks undermining user control and transparency in how personal data is collected and used. Could this mark the end of nearly a decade of progress in internet and web privacy? It would be unfortunate if the newly developing AI economy started with a decrease in privacy and data protection standards. Some analysts or observers might then be inclined to wonder whether this approach to privacy online might signal similar attitudes in other future Google products, like AI.

I can confidently raise this question, having observed and analyzed this area for over 15 years from various perspectives. My background includes experience in web browser security and privacy, including in standardization. I served in the W3C Technical Architecture Group, and have authored scientific papers on privacy, tracking, and fingerprinting, as well as assessments of technologies like Web APIs. This includes the Privacy Sandbox’s Protected Audience API. I was looking forward to the architectural improvements of web privacy. In other words, I am deeply familiar with this context. The media so far have done a great job bringing attention to the issue, but they frame this development as a controversy between Google’s policy change and the UK ICO’s concerns. I believe that the general public and experts alike would benefit from a broader perspective."

https://blog.lukaszolejnik.com/biggest-privacy-erosion-in-10-years-on-googles-policy-change-towards-fingerprinting/

#Google #Surveillance #AdTracking #Privacy #DataProtection

 

"Microsoft’s Recall feature recently made its way back to Windows Insiders after having been pulled from test builds back in June, due to security and privacy concerns. The new version of Recall encrypts the screens it captures and, by default, it has a “Filter sensitive information” setting enabled, which is supposed to prevent it from recording any app or website that is showing credit card numbers, social security numbers, or other important financial / personal info. In my tests, however, this filter only worked in some situations (on two e-commerce sites), leaving a gaping hole in the protection it promises.

When I entered a credit card number and a random username / password into a Windows Notepad window, Recall captured it, despite the fact that I had text such as “Capital One Visa” right next to the numbers. Similarly, when I filled out a loan application PDF in Microsoft Edge, entering a social security number, name and DOB, Recall captured that. Note that all info in these screenshots is made up, but I also tested with an actual credit card number of mine and the results were the same."

#Microsoft #MicrosoftRecall #DataProtection #Privacy

https://www.tomshardware.com/software/windows/microsoft-recall-screenshots-credit-cards-and-social-security-numbers-even-with-the-sensitive-information-filter-enabled

 

"After decades of research, models like these are now entering clinical trials and starting to be used for patient care. Virtual replicas of many other organs are also being developed.

Engineers are working on digital twins of people’s brains, guts, livers, nervous systems, and more. They’re creating virtual replicas of people’s faces, which could be used to try out surgeries or analyze facial features, and testing drugs on digital cancers. The eventual goal is to create digital versions of our bodies—computer copies that could help researchers and doctors figure out our risk of developing various diseases and determine which treatments might work best. They’d be our own personal guinea pigs for testing out medicines before we subject our real bodies to them.

To engineers like Niederer, it’s a tantalizing prospect very much within reach. Several pilot studies have been completed, and larger trials are underway. Those in the field expect digital twins based on organs to become a part of clinical care within the next five to 10 years, aiding diagnosis and surgical decision-making. Further down the line, we’ll even be able to run clinical trials on synthetic patients—virtual bodies created using real data.

But the budding technology will need to be developed carefully. Some worry about who will own this highly personalized data and how it could be used. Others fear for patient autonomy—with an uncomplicated virtual record to consult, will doctors eventually bypass the patients themselves? And some simply feel a visceral repulsion at the idea of attempts to re-create humans in silico. “People will say ‘I don’t want you copying me,’” says Wahbi El-Bouri, who is working on digital-twin technologies. “They feel it’s a part of them that you’ve taken.”"

https://www.technologyreview.com/2024/12/19/1108447/digital-twins-human-organs-medical-treatment-drug-trials/

#DigitalTwins #Biotechnology #Health #Privacy #DataProtection

[–] [email protected] 2 points 3 weeks ago

"On Tuesday the Consumer Financial Protection Bureau (CFPB) published a long anticipated proposed rule change around how data brokers handle peoples’ sensitive information, including their name and address, which would introduce increased limits on when brokers can distribute such data. Researchers have shown how foreign adversaries are able to easily purchase such information, and 404 Media previously revealed that this particular data supply chain is linked to multiple acts of violence inside the cybercriminal underground that has spilled over to victims in the general public too.

The proposed rule in part aims to tackle the distribution of credit header data. This is the personal information at the top of a credit report which doesn’t discuss the person’s actual lines of credit. But currently credit header data is distributed so widely, to so many different companies, that it ends up in the hands of people who use it maliciously."

https://www.404media.co/u-s-government-tries-to-stop-data-brokers-that-help-dox-people-through-credit-data/

[–] [email protected] 2 points 3 weeks ago

"The United States government’s leading consumer protection watchdog announced Tuesday the first steps in a plan to crack down on predatory data broker practices that the agency says help fuel scams, violence, and threats to US national security.

The Consumer Financial Protection Bureau is proposing a rule that would allow regulators to police data brokers under the Fair Credit Reporting Act (FCRA), a landmark privacy law enacted more than a half century ago. Under the proposal, data brokers would be limited in their ability to sell certain sensitive personal information, including financial data and credit scores, phone numbers, Social Security numbers, and addresses. The CFPB says that closing the loopholes allowing data brokers to trade in this data with little to no oversight will benefit vulnerable people and the US as a whole."

https://www.wired.com/story/cfpb-fcra-data-broker-oversight/
