All Discussions > Steam Forums > Off Topic > Topic Details
Dumb question but… (Part 9)
What happens if someone uses AI to make fake evidence for a court case (criminal or civil)? Will there be a way to counter this? How would they even know if it's genuine? What about deepfakes? Fake recorded convos?

Disclaimer: I already know making fake evidence is illegal but what if AI gets so good you can’t tell if it’s real or fake?
Last edited by Xero_Daxter; 8 hours ago
Showing 1-11 of 11 comments
Nobody knows. Most likely it would be treated as real evidence, or else dismissed for lack of corroboration; but lack of corroboration has never stopped evidence from being admitted before, so that's unlikely.

This is part of the issue with some of these tools.
oh my
The reason I opened this topic is that while I was making AI art of myself, I was thinking, "Why don't people ever use AI to create fake evidence and win their criminal case?" Will there ever be a way for anyone to find out?
Analyse the evidence with AI ofc.
Pretty sure there are ways to check if any evidence is AI generated.

If humans get so incompetent that AI can trick the court, I guess it's vigilante justice or hiring an assassin.
Originally posted by Goldias:
Pretty sure there are ways to check if any evidence is AI generated.

If humans get so incompetent that AI can trick the court, I guess it's vigilante justice or hiring an assassin.
I mean… I suppose. Haha. But still, people get away with AI stuff all the time. Back in college someone used AI to do his homework for him. How did I find out? He told me after graduation.
The author of this thread has indicated that this post answers the original topic.
I think it would come down to the sourcing of the supposed evidence. If someone commits a crime in public they may well be recorded by multiple cameras from multiple angles; even if one camera purportedly showed a crime, it could be proven false by a lack of corroboration from the other cameras recording in the area at the same time.

Also AI companies will presumably store generated material long term so that any supposed evidence can be checked against AI company records.

I suspect law enforcement will ultimately have to use AI to scan whether something has been made by AI, and I assume that at the pixel level there would still be some inconsistencies from the material having been generated rather than recorded.

If it's impossible to determine whether image/video evidence is real or not, it'll simply be treated the way a large amount of evidence is already treated: as possibly ambiguous and subject to reasonable doubt.

For example - a blood spot on someone's carpet from a murder victim.
Is it because the accused murdered them and failed to clean up 100% of the evidence?
Or is it because the murder victim was their friend and suffered a nosebleed a year earlier and it wasn't bleached out?

On a jury you have to weigh the balance of probability with all evidence today; the future will be pretty much the same.
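On the sourcing point: one concrete check that already exists is provenance metadata. Some AI image generators embed C2PA manifests (stored in JUMBF boxes) in the files they produce. A minimal sketch of that idea, with `has_c2pa_marker` as a hypothetical helper name — and note the big caveat that metadata is trivially stripped, so a miss proves nothing about whether a file is AI-generated:

```python
def has_c2pa_marker(data: bytes) -> bool:
    """Naive provenance check: scan raw file bytes for the C2PA
    manifest label and the JUMBF box type that some AI generators
    embed in their output. Absence of these markers does NOT mean
    the file is genuine, since metadata is easily removed."""
    lowered = data.lower()
    return b"c2pa" in lowered or b"jumb" in lowered

# Toy byte strings standing in for real files:
print(has_c2pa_marker(b"\xff\xd8...jumb...c2pa...manifest"))  # True
print(has_c2pa_marker(b"\xff\xd8...ordinary camera photo"))   # False
```

So metadata checks catch the lazy cases, while the harder re-encoded fakes are exactly where the corroboration-from-other-sources argument above has to do the work.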
Last edited by Santa Klaus; 8 hours ago
Suspicions of tampering or misinformation are always in mind anyway.
I have been thinking about this question for a couple of years, and I still don't know the answer. I think it's already too late to believe audio and video "evidence".

We all know law enforcement is corrupt.
Last edited by Abaddon the Despoiler; 8 hours ago
Originally posted by Abaddon the Despoiler:
I have been thinking about this question for a couple of years, and I still don't know the answer. I think it's already too late to believe audio and video "evidence".

We all know law enforcement is corrupt.
Got me thinking: what if police use AI to fake body cam footage? Well, crap. We're screwed.
Watch Steve Lehto.

He has done videos on lawyers already getting caught sending in bogus AI-generated paperwork.

It's already happening.
