
Unity Chatbot Asset - Approach To Simulate Human Being

Discussion in 'Assets and Asset Store' started by SkandYxyz, Jan 30, 2017.

  1. Chapmania_Design

    Chapmania_Design

    Joined:
    Jan 27, 2017
    Posts:
    8
    Is it possible to create a conversational 3D talking character using this asset, similar to a voice-recognition/TTS/STT system?
     
  2. Chapmania_Design

    Chapmania_Design

    Joined:
    Jan 27, 2017
    Posts:
    8
    Can it parse the text to a lip-syncing system?
     
  3. SamIsHere006

    SamIsHere006

    Joined:
    Aug 12, 2020
    Posts:
    3
    How do I add voice interaction to this?
     
  4. BryanO

    BryanO

    Joined:
    Jan 8, 2014
    Posts:
    186
    It appears there's a difference between the files that compile for an Android build and the files that are used for a Windows build.

    What are those differences? How do we change the dialog for each scenario? Please offer step-by-step guidance.
     
    Anpu23 likes this.
  5. Anpu23

    Anpu23

    Joined:
    Sep 4, 2017
    Posts:
    16
    I did this. It wasn't easy, but in essence: I changed the input and output strings from private to public and disabled the input button so the chatbot reads the input string without prompting. I used a voice-to-text converter and fed its output in as the chatbot's input, took the chatbot's output and injected it into a text-to-voice converter, animated a ragdoll with the voice, and then cleared the input line before it went back into the chatbot (otherwise the input would pick up the generated voice). It works, and I am using it to present for my company at Million Cups. I'll be posting video when I do it on the Lytchgate YouTube channel.
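    A minimal sketch of the loop described above, for anyone trying to reproduce it. All component and field names here (ChatbotCore, SpeechToText, TextToSpeech, inputText, outputText, ProcessInput) are placeholders; the actual asset's script and field names will differ, so treat this as a shape to adapt rather than drop-in code:

    ```csharp
    using UnityEngine;

    // Hypothetical bridge component: microphone -> chatbot -> speech.
    // ChatbotCore, SpeechToText, and TextToSpeech stand in for the real
    // asset's scripts and whichever STT/TTS providers you wire up.
    public class VoiceChatbotBridge : MonoBehaviour
    {
        public ChatbotCore chatbot;   // inputText/outputText made public
        public SpeechToText stt;      // any voice-to-text source
        public TextToSpeech tts;      // any text-to-voice sink

        void Update()
        {
            // Feed recognized speech straight into the chatbot's input
            // string, replacing the disabled input button.
            if (!string.IsNullOrEmpty(stt.LatestResult))
            {
                chatbot.inputText = stt.LatestResult;
                stt.LatestResult = "";        // consume the recognition
                chatbot.ProcessInput();       // what the button used to do
            }

            // Speak the reply (this is what drives the lip-sync/ragdoll),
            // then clear both lines so the microphone doesn't re-capture
            // the generated voice as new input.
            if (!string.IsNullOrEmpty(chatbot.outputText))
            {
                tts.Speak(chatbot.outputText);
                chatbot.outputText = "";
                chatbot.inputText = "";
            }
        }
    }
    ```

    The clearing step at the end is the important detail from the post: without it, the TTS audio loops back through speech recognition into the chatbot.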
     
  6. Anpu23

    Anpu23

    Joined:
    Sep 4, 2017
    Posts:
    16
    My question is this: why does the chatbot work seamlessly in the editor, but the moment I make a build it simply stops working? I've built both WebGL and Windows. The only lines of code I changed were to make the input and output texts public instead of private and to disable the input button, so that when the input line changes it executes rather than waiting for the button.

    The reason for this is that I added voice control and vocal response. I'm using a ragdoll animated with SALSA LipSync instead of the default animation asset; again, that shouldn't matter. But I am not seeing any response at all from the chatbot. I can see the input arriving from the voice, and the system responds to other input (I also added other vocal commands to help navigate), but the chatbot is simply not responding, as if it's still waiting for input. There are no error logs. What am I missing?
     
  7. Anpu23

    Anpu23

    Joined:
    Sep 4, 2017
    Posts:
    16
    Yes, I have accomplished this.
     
  8. kush24

    kush24

    Joined:
    Jan 30, 2020
    Posts:
    2
    Can you please help me understand how you did it? I need to make something similar.