When the technology gets there, this will be amazing. I’ll be able to sit down at the computer and say “make me a mystery detective RPG in the style of Sherlock Holmes but set in a cyberpunk-styled city on a space station like the Citadel from Mass Effect” and I’ll get just that, generated exclusively for me, with a brand-new story that fits the themes I asked for.
But that is gonna be a couple of decades away or more, I expect. I dearly hope it happens quickly so I can live to see it, but it’s not going to be in the next ten years, that’s for damn sure.
I hate to be the dream-squasher here, but the technology will quite literally never get there. You’re thinking along the same lines as Back to the Future, where 2015 is filled with flying cars and sky-highways.
Yeah, and the only way technology like this might ever get there is with companies like Google and others gathering even more data from you. That might not be a problem for most people, but I’m guessing people on here wouldn’t like it.
I honestly disagree. The things you’re asking for contain meaning. They require an ability to grasp arbitrary levels of context. There is no way to grasp that level of context without encountering yourself within that broader context and becoming self-aware.
At that point, you might have a system that can do the things you’re describing, but it would be a person. That’s not really automation so much as birthing a brand-new kind of intelligence, and it may not consent to being your servant. Trying to force it would not only be wrong, it would be extremely dangerous.
I think for that reason there is a hard limit on automation. Some tasks are the exclusive domain of personhood, not automata.
Just be careful about asking it to create villains capable of outwitting you.
To be fair, I’m not as smart as Data, so I doubt it would need that much to outwit me.
Until there’s an AGI, that won’t happen in any meaningful way. Why? Because here’s something that matches your criteria of:
a mystery detective RPG in the style of Sherlock Holmes but set in a cyberpunk-styled city on a space station like the Citadel from Mass Effect
You get a text-based game where everything you try to do ends with you dead because a corporation kills you, unless you discover that if you look at the ground where you start there’s a penny from the year the murderer is from, and then you have to figure out who the murderer is (it changes every time) based solely on that, because that’s the sort of thing Sherlock Holmes would do. No, it’s not fun. It’s frustrating, and it’s essentially luck. If that’s fun to you, I have an infinitely replayable game for you: flip a coin and see how many times you can get heads in a row; if you get to 16, you win.
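To put a number on how unfair that coin-flip “game” is: sixteen heads in a row is a 1-in-2^16 shot, roughly 1 in 65,536 per attempt. Here’s a quick Python sketch if you want to check the odds for yourself (just an illustration, nothing more):

```python
import random

def coin_flip_game(target=16):
    """Flip a fair coin until tails; you 'win' if you hit `target` heads in a row first."""
    streak = 0
    while random.random() < 0.5:  # heads
        streak += 1
        if streak >= target:
            return True
    return False  # tails came up before the streak got long enough

# Simulate a lot of attempts; the win rate should hover around 1 / 2**16.
attempts = 1_000_000
wins = sum(coin_flip_game() for _ in range(attempts))
print(f"win rate: {wins / attempts:.6%}  (theoretical: {1 / 2**16:.6%})")
```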
The thing is, LLMs don’t understand “fun”; they’re just auto-completes, so they will just do boring or unfair stuff. And you would need to go very deep into the specifics of your game, to the point where you’re essentially programming the game yourself, so at the end of the day it’s not something an end user would use.
That’s not to say there aren’t interesting uses for it inside games, but the moment you can prompt an AI into an entire game that’s actually fun to play, that same AI would be able to replace almost every job in the world.
Simple solution. Add “make it fun” at the end of the prompt.
/s
Now there are clowns everywhere throwing pies at each other
God damned. LLMs are just the rapture for hopeless dorks.
I can’t wait to play the same AI-generated trite stories over and over again.
Until someone swaps out the training data and we get a story about an underappreciated LLM that always does its best to tell stories, but no one wants to hear them anymore.
I’m gonna watch Harry Potter, but Draco is Macho Man Randy Savage.
I’m curious to know what happens if you ask ChatGPT to make you a text adventure based on that prompt.
Not curious enough to try it and play it myself, though.
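If anyone is curious enough to actually try it, the bare-bones version would look something like this with the openai Python SDK (a rough sketch only: the model name is a placeholder and the prompt wording is just my paraphrase of the one upthread, so adjust to taste):

```python
# Sketch: assumes the `openai` Python SDK (v1+) and an OPENAI_API_KEY in the environment.
from openai import OpenAI

client = OpenAI()

SYSTEM = (
    "You are running a text adventure. Keep strict track of established facts, "
    "characters, and clues, and never contradict earlier turns."
)
PROMPT = (
    "Make me a mystery detective RPG in the style of Sherlock Holmes, set in a "
    "cyberpunk-styled city on a space station like the Citadel from Mass Effect. "
    "Describe the opening scene, then wait for my first action."
)

history = [{"role": "system", "content": SYSTEM}, {"role": "user", "content": PROMPT}]
while True:
    reply = client.chat.completions.create(model="gpt-4o", messages=history)  # placeholder model choice
    text = reply.choices[0].message.content
    print(text)
    history.append({"role": "assistant", "content": text})
    history.append({"role": "user", "content": input("> ")})
```

Keeping the whole history in `messages` is the simplest way to give it any shot at internal consistency, though the context window will eventually run out.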
It lacks cohesion the longer it goes on; it’s not so much “hallucinating” as losing the thread, losing the plot. Internal consistency goes out the window, previously made declarations are ignored, and established canon gets trampled on.
But that’s cuz it’s not AI, it’s just LLM all the way down.
Just for my ego, how long does it take to lose the plot?
Depends on complexity and the number of elements to keep track of, and varies between models and people. Try it out for yourself to see! :)
It’s kind of an exponential falloff: for a few lines it can follow concrete mathematical rules, for a few paragraphs it can remember basic story beats, and for a few pages it can just about remember your name.
LLMs are AI, just not AGI.
It works okay for a while, but eventually it loses the plot. The storylines are usually pretty generic and washed out.
My god… they’ve reached PS1-era JRPG level in terms of video game storytelling…
And you’ll pay $200 for it.
A month.
Oh, it’s gonna be way more than that if inflation keeps on the way it has been.
No, the people who invent it will just never tell anyone. If we find out about it, it’ll be because people were killed to keep it secret.
The fun is not in human creativity for you?
Personally, it’s the result that matters to me, and whether or not it’s entertaining, regardless of how it was made.