Artificial Intelligence


  • 2,577 Responses
  • monNom

    Nvidia releases an open-source robot AI platform, GR00T.
    https://github.com/NVIDIA/Isaac-…

    @10:00


    • Neat architecture, with a vision/LLM for 'thinking' and a diffusion model for action/motor control. (monNom)
    • Seeing Jensen on the stage presenting like this... I just miss Steve's keynotes, man.
    • 10:55
    • The robotics hype-bubble pivot is here... I just want my smartphone battery to last over 8 hours. Is that too much of a hurdle for these tech geniuses? (jonny_quest_lives)
    • Indeed.
    • If you can fine-tune a robot for a task the way you can fine-tune an LLM or a diffusion model (i.e., cheaply), I can see these things becoming useful very quickly. (monNom)
    • Task-specific robots have never been the hurdle. (jonny_quest_lives)
    • Love that these little robots look like something you'd see in Star Wars. Take my money! (microkorg)
    • Because Disney Imagineering did the heavy lifting... https://www.youtube.… (jonny_quest_lives)
    • https://la.disneyres… (jonny_quest_lives)
    • I was thinking of a general-purpose robot that could be cheaply retrained to get good at specific tasks. Redeployable, not purpose-built. (monNom)
    • Time to finally boot up my 3D printer and start building out my CHAPPiE unit. (prophetone)
    • One day, maybe 10 years out, you'll be able to Amazon-order a humanoid robot to your door. Turnkey, ready to go, is where we're headed. (prophetone)
    • Imagine a future where criminals use robots trained in thievery, etc., to burglarize homes on their behalf... (prophetone)
    • A future where, because of this, the right to bear arms will mean home-protection robots marketed to worried homeowners. (prophetone)
    • The average Joe can already live in Black Mirror and have a Metalhead dog shipped to his door overnight. https://youtu.be/Btt… (prophetone)
    • You can buy a Chinese humanoid robot today. https://www.robotsho… (monNom)
    • If you read the Disney white paper, it's still just a puppet: it uses reinforcement learning, but it still relies on an operator. (jonny_quest_lives)
    • Presumably those operator controls are the 'action tokens' shown here? https://raw.githubus… (monNom)
    • My impression is that the robot's movement will take input from vision, text (verbal commands), kinematics, and then special action tokens. (monNom)
    • It probably shouldn't be moving unless you give it a command/objective. (monNom)
    • But yes, no doubt the emotive performance at the Nvidia presentation was nearly 100% operator-driven, plus fine-tuned cuteness. (monNom)
    • It doesn't seem a stretch that the robot could generate those action tokens itself, given the right verbal/visual stimulus. (monNom)
    • This thing is cute AF! (ideaist)
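    The "vision/LLM for thinking, diffusion for acting" split discussed in the thread can be sketched in a few lines. This is a toy illustration, not NVIDIA's GR00T code: every function name, shape, and the noise schedule here is made up for the example. The idea is that a vision-language backbone compresses camera features plus a verbal command into one conditioning vector, and a diffusion head then iteratively denoises random noise into a short chunk of motor actions; an operator's teleoperation can supply the same action chunks during data collection, which is how a puppet-style demo and an autonomous policy can share one interface.

    ```python
    # Hedged sketch of a vision-language-action policy with a diffusion
    # action head. Illustrative only; not NVIDIA's implementation.
    import numpy as np

    rng = np.random.default_rng(0)

    def vlm_encode(image_feat: np.ndarray, command: str) -> np.ndarray:
        """Stand-in for the vision/LLM backbone: fold image features and a
        crude command embedding into one conditioning vector."""
        cmd_emb = np.array([ord(c) for c in command], dtype=float)
        cmd_emb = np.resize(cmd_emb, image_feat.shape) / 128.0
        return np.tanh(image_feat + cmd_emb)

    def diffusion_action_head(cond: np.ndarray, horizon: int = 4,
                              action_dim: int = 3, steps: int = 8) -> np.ndarray:
        """Toy denoising loop: start from Gaussian noise and pull the action
        chunk toward a conditioning-dependent target a little each step."""
        target = np.outer(np.linspace(1.0, 0.5, horizon), cond[:action_dim])
        actions = rng.normal(size=(horizon, action_dim))
        for t in range(steps):
            alpha = (t + 1) / steps          # illustrative noise schedule
            actions = (1 - alpha) * actions + alpha * target
        return actions

    obs = rng.normal(size=8)                 # fake image features
    cond = vlm_encode(obs, "pick up the cup")  # 'thinking' pathway
    chunk = diffusion_action_head(cond)        # 'acting' pathway
    print(chunk.shape)                       # (4, 3): 4 timesteps x 3 DoF
    ```

    The point of the split is that the slow "thinking" model runs at a low rate to set the goal, while the diffusion head emits dense, continuous action chunks at control rate, and fine-tuning for a new task only has to touch the conditioning/head, which is what would make cheap per-task retraining plausible.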
