multiplewolves

joined 8 months ago
[–] [email protected] 1 points 3 hours ago

I’m happy to address your reply.

See, there you go, lost me completely now. "We should be preemptively pissed off about imaginary offenses because you just KNOW these people will eventually get there" is not how we should run our brains, let alone our regulations.

That’s a wildly inaccurate characterization of what I said. I’m trying to get out of this interaction because you misinterpret me and then move the goalposts. You went from “we don’t really know what happened” (which isn’t true) to “my point all along is that what’s really happening should be the focus, these things happened with the system working as intended,” which is still incorrect. Now you’re splitting hairs over inconsequential details based on a broad misunderstanding.

And now I'm skeptical about not just your hypothetical objections but about all of them. That's the type of process I find counterproductive.

Nice dismissal of my entire perspective without understanding it. My objections aren’t hypothetical. We know that audio clips were accidentally saved because it happened. We know that Apple knows it happened because they acknowledged it with a formal apology. The intention isn’t the important point. They apologized because they got caught. If they hadn’t gotten caught, their audio capture would have continued and probably expanded as they sought to streamline their services. That’s a reasonable projection.

Is your case here really that I had a point up until I requested we end this interaction? And then suddenly nothing I had said made sense to you anymore? Please.

Anyway, all good with me in the agree to disagree front. Have a nice one yourself.

Sure.

[–] [email protected] 1 points 9 hours ago (2 children)

I don’t think seeing a logical progression or escalation is normalizing the current state. It wasn’t, as you put it earlier, “working as intended”. But anyone observing corporate behavior over decades can see that today’s accident or unpopular innovation can become tomorrow’s status quo unless it gets enough pushback.

We haven’t heard about the transgressions that are being committed by corporations right now because they haven’t been caught yet. What’s considered legal is, and we clearly agree on this point, already well beyond the pale.

Everyone should be objecting to violations of privacy, both the ones we can prove and anything hypothetical that could occur. It is not worthless to object preemptively to something that hasn’t happened yet.

If there had been significant, detailed information available about TSA scanners prior to their rollout, for example, the outcry might have halted their use, or at least delayed it. Anyone who described in theoretical terms how those scanners work would have been labeled “hyperbolic” and “out of touch” before the tech became reality. They’re truly invasive. Anything that seems technologically out of reach with current solutions could well be around the corner.

Anyway, we’re going in circles. I’ve been trying to end this conversation implicitly without success, so on to explicitly: thank you for the discourse and have a good night/day.

[–] [email protected] 1 points 9 hours ago (4 children)

Why go for the hypothetical future intrusion instead of the current, factual intrusions, you know?

¿Por qué no los dos?

I am the one who brought up the case in the first place because it is truly alarming in and of itself. I’m surprised it doesn’t come up more. It seems to me that the pervasiveness of voice-activated assistants, like the cross-site tracking that paved the way for fingerprinting, deserves more attention, both as a problem now and as a gateway to potentially more egregious violations of privacy later. Don’t doubt that the fears could materialize.

But fair enough! I think we agree far more than we diverge here.

[–] [email protected] 1 points 10 hours ago (6 children)

My reply was addressing what you’d said here:

So we know they paid some money to settle that, but we don't know what was going on (beyond research like the one in the linked article by the OP that says it's unlikely anybody is sending secret voice data).

We do know what was going on. It wasn’t user-end research. A contractor whose job was to evaluate the efficacy of Siri approached the media because they could tell that the audio capture behind much of what they were hearing wasn’t intentional.

To your earlier points, I hope Apple is terrified, and I don’t think that voice activation can be implemented in a way that protects its users from privacy violations.

I don’t know what about my reply led you to believe I am ok with any of this, but to clarify, I am a proponent of strict privacy laws that protect consumers before businesses.

I think “accidents” precede intentional action and I only trust Apple (or any other big tech company) as far as I can throw it.

[–] [email protected] 6 points 11 hours ago (8 children)

Nearly every settlement with a major corporation is settled without the company admitting wrongdoing. I don’t doubt that there was an accidental glitch involved. What confuses me is why that makes it ok to you.

It’s generally a safe bet with cases like this that it would not have made it as far as it did in court, or resulted in as hefty a settlement, if the evidence hadn’t been damning.

Here’s the original article in the Guardian that set the whole thing in motion. Apple formally apologized for it.

In other words, we kinda do know what happened. There was a whistleblower on the contractor side.

[–] [email protected] 55 points 16 hours ago (10 children)

People worried about “digital eavesdropping” aren’t paranoid. There’s an entire class-action lawsuit based on Apple’s Siri getting caught activating without the trigger command and sending the captured audio to third-party providers.

[–] [email protected] 2 points 3 days ago

Regarding this:

would you want previews of content requiring login, perhaps with a risk of accidentally changing related logged in state?

Absolutely not, no. Many platforms have a waiting period between a user’s deletion request and the actual account deletion, during which logging back in will cancel the request.

[–] [email protected] 48 points 3 weeks ago (1 children)

security researchers have repeatedly demonstrated that implementing so-called "lawful" backdoors is inherently flawed as such vulnerabilities would inevitably be discovered, accessed, and exploited by cybercriminals and black-hat hackers.

Yes, that. Every time this comes up, it requires a rehashing of just how dramatically bad a practice it is. There’s no such thing as a “back door only for the good guys.”

Notably, the European Commission makes no mention of new partnership initiatives with the United States.

Probably wise.

[–] [email protected] 8 points 1 month ago

I’d been debating commenting on the unexpected factorial. Thank you for doing the honors.

[–] [email protected] 5 points 1 month ago (1 children)

Unfortunately for those not comfortable purchasing from AliExpress, the system is not yet available on Topton's official online store.

https://www.techpowerup.com/333955/topton-m1-amd-ryzen-3-powered-mini-pc-unveiled-with-tiny-chassis
