
Unity Chatbot Asset - Approach To Simulate Human Being

Discussion in 'Assets and Asset Store' started by SkandYxyz, Jan 30, 2017.

  1. Chapmania_Design


    Joined:
    Jan 27, 2017
    Posts:
    8
    Is it possible to create a conversational 3D talking character using this asset, similar to a voice recognition / TTS / STT system?
     
  2. Chapmania_Design


    Joined:
    Jan 27, 2017
    Posts:
    8
    Can it pass the text to a lip-syncing system?
     
  3. HackerSam


    Joined:
    Aug 12, 2020
    Posts:
    3
    How do I add voice interaction to this?
     
  4. BryanO


    Joined:
    Jan 8, 2014
    Posts:
    186
    It appears there's a difference between the files that compile for an Android build and the files used for a Windows build.

    What are those differences? How do we change the dialog for each scenario? Please offer step-by-step guidance.
     
  5. Anpu23


    Joined:
    Sep 4, 2017
    Posts:
    14
    I did this. It wasn't easy, but in essence: I changed the input and output strings from private to public, and disabled the input button so the chatbot reads the input string without prompting. I used a voice-to-text converter and fed its output into the chatbot as input, took the chatbot's output and injected it into a text-to-voice converter, animated a ragdoll with the voice, and then cleared the input line before control went back to the chatbot (otherwise the input heard the generated voice). It works, and I am using it to present for my company at Million Cups. I'll post a video on the Lytchgate YouTube channel when I do.
     
  6. Anpu23


    Joined:
    Sep 4, 2017
    Posts:
    14
    My question is this: why does the chatbot work seamlessly in the editor, but simply stop working the moment I make a build? I've built for both WebGL and Windows. The only code changes I made were switching the input and output texts from private to public and disabling the input button, so the chatbot executes as soon as the input line changes rather than waiting for a button press.

    The reason for this is that I added voice control and vocal responses. I'm using a ragdoll animated with SALSA LipSync instead of the default animation asset; again, that shouldn't matter. But I'm not seeing any response at all from the chatbot. I can see the input arriving from the voice recognition, and the system responds to other input (I also added other vocal commands to help navigate), but the chatbot simply isn't responding, as if it's still waiting for input. There are no error logs. What am I missing?
     
  7. Anpu23


    Joined:
    Sep 4, 2017
    Posts:
    14
    Yes, I have accomplished this.
     
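The pipeline Anpu23 describes in post #5 can be sketched as a small bridge script. This is an assumption-laden illustration, not the Chatbot asset's actual API: the `Chatbot`, `SpeechToText`, and `TextToSpeech` types, and every member on them, are hypothetical placeholders standing in for the modified asset (with its input/output strings made public) and whichever STT/TTS packages you use.

```csharp
using UnityEngine;

// Minimal sketch of the voice pipeline described above.
// All three component types below are hypothetical placeholders.
public class VoiceChatbotBridge : MonoBehaviour
{
    public Chatbot chatbot;   // assumed: inputText/outputText changed to public
    public SpeechToText stt;  // stand-in for any speech-to-text asset
    public TextToSpeech tts;  // stand-in for any text-to-speech asset

    void Update()
    {
        // Feed recognized speech into the chatbot's input and trigger
        // processing directly, replacing the disabled submit button.
        if (!string.IsNullOrEmpty(stt.RecognizedText))
        {
            chatbot.inputText = stt.RecognizedText;
            chatbot.ProcessInput();
            stt.RecognizedText = "";
        }

        // Speak the reply, then clear both strings so the microphone
        // doesn't pick up the generated voice as new input — the
        // feedback loop Anpu23 mentions.
        if (!string.IsNullOrEmpty(chatbot.outputText))
        {
            tts.Speak(chatbot.outputText);
            chatbot.outputText = "";
            chatbot.inputText = "";
        }
    }
}
```

Wire the three references up in the Inspector. The same shape works whichever STT/TTS assets you choose; note that SALSA only drives lip-sync from an audio source, so the TTS output still needs to play through its own AudioSource.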