"Your honor, the evidence shows quite clearly that the defendant was holding a weapon with his third arm."
If you have ever encountered an AI hallucinating things that simply do not exist, you know how bad the idea of AI-enhanced evidence actually is.
No computer algorithm can accurately reconstruct data that was never there in the first place.
Ever.
This is an ironclad law, just like the speed of light and the acceleration of gravity. No new technology, no clever tricks, no buzzwords, no software will ever be able to do this.
Ever.
If the data was not there, anything created to fill it in is by its very nature not actually reality. This includes digital zoom, pixel interpolation, movement interpolation, and AI upscaling. It preemptively also includes any other future technology that aims to try the same thing, regardless of what it's called.
One little correction: digital zoom is not something that belongs on that list. It's essentially just cropping the image. That said, "enhanced" digital zoom I agree should be on that list.
Digital zoom is just cropping and enlarging. You're not actually changing any of the data. There may be enhancement applied to the enlarged image afterwards but that's a separate process.
But the fact remains that digital zoom cannot create details that were invisible in the first place due to the distance from the camera to the subject. Modern implementations of digital zoom always use some manner of interpolation algorithm, even if it's just a simple linear blur from one pixel to the next.
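To sketch the point in a few lines (a hypothetical toy example, not any real camera's pipeline): nearest-neighbor "zoom" only copies existing pixels, and interpolation only averages them, so neither can produce detail that wasn't captured.

```python
# A tiny 2x2 "image" standing in for the cropped region of a digital zoom.
img = [[10, 20],
       [30, 40]]

def nearest_upscale(image, factor):
    """Nearest-neighbor upscale: every output pixel is a copy of an input pixel."""
    out = []
    for row in image:
        wide = [px for px in row for _ in range(factor)]
        out.extend([wide[:] for _ in range(factor)])
    return out

zoomed = nearest_upscale(img, 2)

# Linear interpolation between two neighboring pixels: the new value is just
# a weighted average of values already present -- it cannot reveal unseen detail.
midpoint = 0.5 * img[0][0] + 0.5 * img[0][1]

print(zoomed)    # every value is one of the original four pixels
print(midpoint)  # 15.0, strictly between 10 and 20
```

Real implementations use fancier kernels (bilinear, bicubic), but they are still weighted averages of the pixels that were captured.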
The problem is not how digital zoom works; it's how people think it works. A lot of people (i.e. [l]users, ordinary non-technical people) still labor under the impression that digital zoom somehow brings the picture "closer" to the subject and can enlarge or reveal details that were not detectable in the original photo, a notion we need to excise from people's heads.
Hold up. Digital zoom is, in all the cases I'm currently aware of, just cropping the available data. That's not reconstruction, it's just losing data.
Otherwise, yep, I'm with you there.
I think we need to STOP calling it "Artificial Intelligence". IMHO that is a VERY misleading name. I do not consider guided pattern recognition to be intelligence.
A term created in order to vacuum up VC funding for spurious use cases.
Optical Character Recognition used to be firmly in the realm of AI until it became so common that even the post office uses it. Nowadays, OCR is so common that instead of being proper AI, it's just another mundane application of a neural network. I guess eventually Large Language Models will fall outside the scope of AI too.
How long until we get upscalers of various sorts built into tech that shouldn't have them? For bandwidth reduction, for storage compression, or for cost savings. Can we trust what we capture with a digital camera, when companies replace a low-quality image of the moon with a professionally taken picture at capture time? Can sports replays be trusted when the ball is upscaled on the judges' screens? Cheap security cams with "enhanced night vision" might get somebody jailed.
I love the AI tech. But its future worries me.
It will run wild for the foreseeable future until the masses stop falling for the gimmicks; then, once the bullshit AI stops making money, it will be reserved for the actual use cases where it's beneficial.
Lol, you think the masses will stop falling for it in gimmicks? Just look at the state of the world.
AI-based video codecs are on the way. This isn't necessarily a bad thing because it could be designed to be lossless or at least less lossy than modern codecs. But compression artifacts will likely be harder to identify as such. That's a good thing for film and TV, but a bad thing for, say, security cameras.
The devil's in the details and "AI" is way too broad a term. There are a lot of ways this could be implemented.
I don't think loss is what people are worried about, really - more injecting details that fit the training data but don't exist in the source.
Given the hoopla Hollywood and directors made about frame-interpolation, do you think generated frames will be any better/more popular?
Jesus Christ, does this even need to be pointed out!??
Unfortunately it does need pointing out. Back when I was in college, professors would need to repeatedly tell their students that real-world forensics don't work like they do on NCIS. I'm not sure how much things may or may not have changed since then, but with American literacy levels being what they are, I don't suppose things have changed that much.
Yes. When people were in full conspiracy mode on Twitter over Kate Middleton, someone took that grainy pic of her in a car and used AI to "enhance" it, then declared it wasn't her because her mole was gone. It got so much traction that people thought the AI-fixed-up pic WAS her.
I met a student at university last week at lunch who told me he is stressed out about some homework assignment. He told me that he needs to write a report with a minimum number of words so he pasted the text into chatGPT and asked it about the number of words in the text.
I told him that every common text editor has a word count built in, and that ChatGPT is probably not good at counting words (even though it pretends to be).
Turns out that his report was already waaaaay above the minimum word count and even needed to be shortened.
So much for the understanding of AI in the general population.
I'm studying at a technical university.
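For the record, the thing the student needed is a one-liner, and it's deterministic in a way an LLM's guess is not (a minimal sketch; the sample sentence is made up):

```python
def word_count(text: str) -> int:
    # The same thing a text editor's built-in counter does:
    # split on whitespace and count the pieces. No language model required.
    return len(text.split())

report = "AI enhanced evidence has no place in a courtroom."
print(word_count(report))  # 9
```

An LLM, by contrast, sees text as tokens rather than words, so asking it to count words invites a confident but unreliable answer.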
The layman is very stupid. They hear all the fantastical shit AI can do and they start to assume it's almighty. That's how you wind up with those lawyers who tried using ChatGPT to write a legal brief that was full of bullshit, and didn't even bother to verify whether it was accurate.
They don't understand it; they only know that the results look good.
Of course, not everyone is technology literate enough to understand how it works.
That should be the default assumption: something should be explained so that others understand it and can make better, informed decisions.
I’d love to see the “training data” for this model, but I can already predict it will be 99.999% footage of minorities labelled ‘criminal’.
And cops going “Aha! Even AI thinks minorities are committing all the crime”!
Imagine a prosecution or law enforcement bureau that has trained an AI from scratch on specific stimuli to enhance and clarify grainy images. Even if they all were totally on the up-and-up (they aren't, ACAB), training a generative AI or similar on pictures of guns, drugs, masks, etc for years will lead to internal bias. And since AI makers pretend you can't decipher the logic (I've literally seen compositional/generative AI that shows its work), they'll never realize what it's actually doing.
So then you get innocent CCTV footage this AI "clarifies" and pattern-matches every dark blurb into a gun. Black iPhone? Maybe a pistol. Black umbrella folded up at a weird angle? Clearly a rifle. And so on. I'm sure everyone else can think of far more frightening ideas like auto-completing a face based on previously searched ones or just plain-old institutional racism bias.
According to the evidence, the defendant clearly committed the crime with all 17 of his fingers. His lack of remorse is obvious by the fact that he's clearly smiling wider than his own face.
clickity clackity
"ENHANCE"
AI enhanced = made up.
It's incredibly obvious when you call the current generation of AI by its full name, generative AI. It's creating data, that's what it's generating.
Everything that is labeled "AI" is made up. It's all just statistically probable guessing, made by a machine that doesn't know what it is doing.
The fact that it made it that far is really scary.
I'm starting to think that yes, we are going to have some new middle ages before going on with all that "per aspera ad astra" space colonization stuff.
Aren't we already in a kind of dark age?
People denying science, people scared of diseases and vaccination, people using anything AI or blockchain as if it were magic, people defending power-hungry, all-promising dictators, people divided over and calling the other side barbaric. And of course, wars based on religion.
Seems to me we're already in the dark.
Why not make it a fully AI court, if they were going to go that way? It would save so much time and money.
Of course it wouldn't be very just, but then regular courts aren't either.
This actually opens an interesting debate.
Every photo you take with your phone is post processed. Saturation can be boosted, light levels adjusted, noise removed, night mode, all without you being privy as to what's happening.
Typically people are okay with it because it makes for a better photo - but is it a true representation of the reality it tried to capture? Where is the line of the definition of an ai-enhanced photo/video?
We can currently make the judgement call that a phones camera is still a fair representation of the truth, but what about when the 4k AI-Powered Night Sight Camera does the same?
My post is more tangentially related to original article, but I'm still curious as what the common consensus is.
You'd think it would be obvious you can't submit doctored evidence and expect it to be upheld in court.
For example, there was a widespread conspiracy theory that Chris Rock was wearing some kind of face pad when he was slapped by Will Smith at the Academy Awards in 2022. The theory started because people started running screenshots of the slap through image upscalers, believing they could get a better look at what was happening.
Sometimes I think, our ancestors shouldn’t have made it out of the ocean.