open.spotify.com/episode/1PyiN4cJj9eYmN59MOd9pz

Preview meta tags from the open.spotify.com website.

Linked Hostnames (1)

Thumbnail

Search Engine Appearance

Google

https://open.spotify.com/episode/1PyiN4cJj9eYmN59MOd9pz

Synthetic

Listen to this episode from The Digital Human on Spotify. With the rush of generative AI, we have the capacity to create synthetic companions that seem more human than ever before. They can talk in real time and, with enough user input, can be moulded into a perfect friend: sharing your interests, built with a custom personality that you enjoy, and always available for a brief chat, or to unleash some 3am anxiety upon, without burdening a real human friend. They have the potential to provide some psychological benefit to people. But there are concerns. What if the company behind such an AI companion suddenly changed the terms of service? What if your carefully crafted Synthetic Companion wasn't themselves anymore, or stopped responding in a way that met the user's needs?

This happened in early 2023, when Replika, one of the biggest AI companion apps, decided to ban all adult content without informing its users. The Big Change, as it came to be known, set the Replika community on fire, and showed how issues of control, expectations, and the human propensity to project human attributes onto our machines can come back to bite us.

Yet we should have already known this. Tech developers trying to sell their shiny new product will tell you it's never been seen before. But we've been using technology to create fake humans to interact with for more than a century.

In this episode, Aleks looks to some Synthetic Humans of the past to understand why people bond so readily with them, and how, going forward into a future where we are likely to have AI Humans all around us, we can ensure that they serve our needs and do no harm to the end user.



Bing

Synthetic

https://open.spotify.com/episode/1PyiN4cJj9eYmN59MOd9pz

(Same episode description as the Google preview above.)



DuckDuckGo

https://open.spotify.com/episode/1PyiN4cJj9eYmN59MOd9pz

Synthetic

(Same episode description as the Google preview above.)

  • General Meta Tags (15)
    • title
      Synthetic - The Digital Human | Podcast on Spotify
    • charset
      utf-8
    • fb:app_id
      174829003346
    • X-UA-Compatible
      IE=9
    • viewport
      width=device-width, initial-scale=1
  • Open Graph Meta Tags (177)
    • og:site_name
      Spotify
    • og:title
      Synthetic
    • og:description
      The Digital Human · Episode
    • og:url
      https://open.spotify.com/episode/1PyiN4cJj9eYmN59MOd9pz
    • og:type
      music.song
  • Twitter Meta Tags (5)
    • twitter:site
      @spotify
    • twitter:title
      Synthetic
    • twitter:description
      The Digital Human · Episode
    • twitter:image
      https://i.scdn.co/image/ab6765630000ba8a6073eeaa3d49d9aa6358c422
    • twitter:card
      summary
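The Open Graph and Twitter tags listed above live in the page's HTML head as `<meta>` elements. As a minimal sketch of how a preview tool like this one might collect them, here is a standard-library Python parser run over a small hypothetical HTML fragment mirroring a few of the tags above (not the full Spotify page):

```python
from html.parser import HTMLParser

class MetaTagExtractor(HTMLParser):
    """Collects <meta property="og:*"> and <meta name="twitter:*"> tags."""
    def __init__(self):
        super().__init__()
        self.tags = {}

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        a = dict(attrs)
        # Open Graph uses the "property" attribute; Twitter cards use "name".
        key = a.get("property") or a.get("name")
        if key and (key.startswith("og:") or key.startswith("twitter:")):
            self.tags[key] = a.get("content", "")

# Hypothetical fragment echoing tags from the listing above.
html = """
<meta property="og:site_name" content="Spotify">
<meta property="og:title" content="Synthetic">
<meta property="og:type" content="music.song">
<meta name="twitter:card" content="summary">
"""

parser = MetaTagExtractor()
parser.feed(html)
print(parser.tags["og:title"])  # Synthetic
```

A real crawler would fetch the page over HTTP first; the parsing step, however, is essentially this.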
  • Link Tags (31)
    • alternate
      https://open.spotify.com/oembed?url=https%3A%2F%2Fopen.spotify.com%2Fepisode%2F1PyiN4cJj9eYmN59MOd9pz
    • alternate
      android-app://com.spotify.music/spotify/episode/1PyiN4cJj9eYmN59MOd9pz
    • canonical
      https://open.spotify.com/episode/1PyiN4cJj9eYmN59MOd9pz
    • icon
      https://open.spotifycdn.com/cdn/images/favicon32.b64ecc03.png
    • icon
      https://open.spotifycdn.com/cdn/images/favicon16.1c487bff.png
  • Website Locales (2)
    • en
      https://open.spotify.com/episode/1PyiN4cJj9eYmN59MOd9pz
    • x-default
      https://open.spotify.com/episode/1PyiN4cJj9eYmN59MOd9pz

Links (7)