Karl, like Zed and all truly laconic heroes, has mastered the fact that Silence is Golden.
And duct tape is silver
A difficult situation, to say the least. Kiko is too young to remain single forever, but she’ll need a few years of mourning before she starts dating again.
The anniversary of my daughter’s untimely death approaches. I’ve remained friendly with the young man she was living with at the time of her passing. We usually connect around significant anniversary dates like this.
It took four years before he started dating again after my daughter’s passing.
Mourning is not a linear process, and I sometimes wonder if I make things more difficult for him by continuing to chat with him. He seems a decent young man, although his passion for skydiving continues; a passion he shared with my daughter, whose massive injuries from a skydiving accident led to her eventual suicide.
Doubly difficult, as she and Jack Jr. need a husband/father.
Very true, for both their sakes.
I was just reading a couple of articles; apparently AI has learned to lie…I’d have thought that would be the first thing its programmers taught it, seeing who they work for…but it can also be fooled by two Marines with a cardboard box, so there’s that.
https://redstate.com/wardclark/2024/08/12/ai-may-be-learning-to-deceive-humans-but-ai-isnt-counting-on-marines-n2178023
Heh. Good one about the Marines in a box.
And I see that working for the same reason the lying does…as the man said there, Garbage In, Garbage Out.
What he does not account for is that the garbage input and the programmers themselves become unnecessary and irrelevant as the machines, as part of a master learning program, learn to program themselves. And that’s the danger that Musk and others are foretelling and warning against.
No, ChatGPT-style AIs cannot lie for the same reason they cannot tell the truth – they manipulate words and phrases without understanding them. They are the verbal equivalent of the image-generating AIs that draw hands with six fingers and everyone from the Pope to the Vikings as black. (The fingers are an accident, because the AI knows nothing of the real world, but turning historical figures into Africans must have been programmed – I’d be surprised if OpenAI did not similarly program bias into ChatGPT, though it was done more subtly.)