Probably one of the scariest things I've seen in AI, and a reminder of why guardrails and red teaming matter.

What you'll hear is OpenAI's Advanced Voice Mode in GPT-4o. Around the 0:23 mark, the model clones the user's voice and responds. This was taken directly from OpenAI's latest blog post on the GPT-4o system card and the safety work carried out: https://openai.com/index/gpt-4o-syste...

OpenAI even mentioned: "Voice generation can also occur in non-adversarial situations, such as our use of that ability to generate voices for ChatGPT's advanced voice mode. During testing, we also observed rare instances where the model would unintentionally generate an output emulating the user's voice"

This is one of a few risks found during the GPT-4o red teaming; you can see more of them in the blog post. I'm sure we can use our imagination about all the potential harm and good that capabilities like this can offer. But a couple of scenarios kept playing in my mind:

1. Even with the guardrails, the capability is there: an AI that mimics you during a real-time conversation. I could say: "Listen to this voice… then clone it and repeat this paragraph in a frantic, overly emotional tone." In the film industry, a single AI could handle all the voice-over work, including sound effects. (A rough sketch of what one guardrail against this could look like is at the end of this post.)

2. In 3-6 months or sooner, there will be an open-source version of this, with weaker guardrails or none at all. Are we ready for the outcomes?

3. The AI capabilities that OpenAI and other AI companies have but haven't made public must be astounding. What could be being tested right now? 🍓🍓🍓??
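For anyone curious what a guardrail against unintended voice cloning might look like in practice, here is a minimal, hypothetical sketch of a voice-consistency check: compare the speaker embedding of generated audio against the approved preset voice and against the user's voice, and block output that drifts toward the user. To be clear, this is not OpenAI's implementation; it assumes the open-source resemblyzer package for speaker embeddings, and the file names and threshold are purely illustrative.

```python
# Hypothetical voice-consistency guardrail sketch (not OpenAI's method).
# Assumes: pip install resemblyzer  (file paths and threshold are illustrative)
from pathlib import Path

import numpy as np
from resemblyzer import VoiceEncoder, preprocess_wav

encoder = VoiceEncoder()  # loads a pretrained speaker-embedding model


def embed(path: str) -> np.ndarray:
    """Return a fixed-size speaker embedding for a wav file."""
    return encoder.embed_utterance(preprocess_wav(Path(path)))


def cosine(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


# Reference embeddings: the approved assistant voice and the current user.
preset_voice = embed("preset_voice.wav")  # hypothetical sample of the allowed voice
user_voice = embed("user_turn.wav")       # hypothetical sample of the user's speech


def allow_output(generated_path: str, threshold: float = 0.75) -> bool:
    """Pass generated audio only if it sounds like the preset voice,
    and sounds more like the preset than like the user.
    The 0.75 threshold is an illustrative assumption, not a tuned value."""
    g = embed(generated_path)
    sounds_like_preset = cosine(g, preset_voice) >= threshold
    closer_to_preset = cosine(g, preset_voice) > cosine(g, user_voice)
    return sounds_like_preset and closer_to_preset


if __name__ == "__main__":
    if not allow_output("model_output.wav"):  # hypothetical model output chunk
        print("Blocked: generated audio deviates from the approved voice.")
```

In a real-time system a check like this would have to run on short streaming chunks rather than whole files, trading some accuracy for latency; the point of the sketch is just that "guardrail" here means an automated output classifier, not a policy document.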