it would hold the same meaning as now, which is nothing.
this is automatic writing with a computer. no matter what you train it on, you’re using a machine built to produce output that resembles other output. the machine can’t hold opinions, can’t remember, and can’t answer questions from its training data. all it can do is generate a plausible-looking transcript of a conversation, steered by whatever input you feed it.
one person does not generate enough data in a lifetime, so you’re necessarily using aggregated data from millions of people as a base. there’s also no meaning ascribed to anything in the training data. if you feed it all of a person’s memories, the output conforms to that data the way water conforms to a shower nozzle. it’s just a filter on top.
in regard to the final paragraph, i want computers to exhibit as little personhood as possible because i’ve read the transcripts of the ELIZA experiments. the program could only parse subject-verb-object and echo back the same noun it was fed, and people were saying it should replace psychotherapists.
The deceased’s sister wrote the script. AI/LLMs didn’t write anything. It’s in the article. So the assumptions you made in the middle two paragraphs don’t really apply to this specific news article.
i was responding to the questions posted in the comment i replied to.
also, doesn’t that make this entire thing worse?
No? This is literally a Victim Impact Statement. We see these all the time, after guilt has been determined and before sentencing. This is the opportunity granted to victims to outline how they feel about the matter.
There have been countless court cases where victims say things like “I know that my husband would have understood and forgiven [… drone on for a six-page essay],” or have even done this exact thing without the “AI” video/audio: home videos with a dubbed overlay of a loved one talking about what the deceased would have wanted or thought. It’s not abnormal, and it has been accepted as a way for the aggrieved to voice their wishes to the court.

All that’s changed here is the presentation. This didn’t affect the finding of whether the person was guilty; it was played after that finding and only before sentencing, which is also the customary time for impact statements. The “AI” didn’t write the script. This is just a mildly fancier impact statement, and that’s it. She could have dubbed it over home video with a Fiverr voice actor. Would that change how you feel about it?

I see no evidence that the court treated this any differently from any other impact statement. I don’t think anyone would be fooled into believing the dead person is magically alive and directly making the statement. It’s clear who made it the whole time.