Mining Robots

A truly self-replicating machine needs to acquire its own materials to build things. This is "mining":

  • In a developed urban environment, enormous quantities of discarded materials are available, including valuable recyclable plastics, but the various plastics are mixed with each other and with other materials such as glass and food waste. We need fully automated systems to handle, sort, clean, and process these materials, including upcycling plastic waste into printable filament.
  • Enormous quantities of rocky regolith are available on the surface of almost every planetary body, including Earth's deserts. There are often large deposits of fine powdery rock, most commonly basalt dust, which is roughly uniform in composition throughout the solar system, consisting mostly of silica and oxides of common metals (aluminum, iron, calcium, magnesium).

NASA's Robotic Mining Competition challenges university teams to build a fully autonomous robot to extract rocky regolith from a planetary surface, and deposit it in a hopper for further processing.

See also the Robot page here for mobility and electronics choices.

Automated Material Handling

A robot can move material using one of several mechanisms:

  • A scoop on an arm is mechanically simple, but requires high torque (a rough torque estimate follows this list).
  • A bucket ladder offers higher mass transport rates, but is heavier and can jam, especially when processing rocks.
  • A screw drive also requires high torque, but can penetrate deeply.
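
As a rough illustration of why a scoop on an arm needs high torque, the sketch below estimates the static holding torque at the shoulder joint with the arm horizontal, which is the worst case. All of the lengths, masses, and gravity values are assumed placeholder numbers for illustration, not a recommended design.

  # Rough static torque estimate for a scoop on an arm (illustrative numbers only).
  # Worst case: arm fully extended horizontally, so the full weight of the scoop
  # plus payload acts at the end of the arm.

  GRAVITY = 9.81       # m/s^2 on Earth (about 1.62 on the Moon, 3.71 on Mars)
  ARM_LENGTH = 0.8     # m, shoulder joint to scoop (assumed)
  SCOOP_MASS = 1.5     # kg, empty scoop (assumed)
  PAYLOAD_MASS = 3.0   # kg, regolith carried in the scoop (assumed)
  ARM_MASS = 2.0       # kg, arm structure, treated as acting at its midpoint (assumed)

  def shoulder_torque(gravity=GRAVITY):
      """Static torque (N*m) the shoulder actuator must hold with the arm horizontal."""
      end_load = (SCOOP_MASS + PAYLOAD_MASS) * gravity * ARM_LENGTH
      arm_load = ARM_MASS * gravity * (ARM_LENGTH / 2.0)
      return end_load + arm_load

  if __name__ == "__main__":
      print(f"Holding torque at shoulder: {shoulder_torque():.1f} N*m")
      # Add margin for acceleration and digging forces, which can exceed the static load.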


Automated Control Architecture

A fully automated robot needs to make good decisions about how it moves, and what it processes.

Object Recognition

Planetary surfaces include obstacles such as rocks and craters, which a robot must avoid.

Urban environments contain more complex obstacles such as people, animals, and other vehicles.

Tools for object recognition include:

  • 2D cameras. The computer vision problem is still unsolved, but object recognition neural networks are improving rapidly.
  • Planar LIDAR uses a spinning head and laser light to measure distances through 360 degrees in a single plane. Options range from a Neato XV-11 type laser parallax distance sensor to full scanning LIDAR starting at thousands of dollars. Tilting or otherwise actuating the sensor is required to sweep out a full 3D dataset (a minimal obstacle-check sketch follows this list).
  • Multibeam scanning LIDAR uses several beams to provide a full 3D image, at a substantial hardware cost.
  • Infrared depth sensors like the Kinect-360 or Intel RealSense use an infrared projector to recover depth: the Kinect projects a structured light pattern, while the RealSense combines the projector with stereo cameras and can smoothly transition to pure stereo operation in bright sunlight, which washes out the active emitter.
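
As a minimal sketch of how a single planar LIDAR scan can drive an obstacle decision, the function below flags the closest return inside a forward-facing arc. The scan layout (evenly spaced bearings over 360 degrees, ranges in meters) and all thresholds are assumptions for illustration and do not correspond to any particular sensor driver.

  import math

  def nearest_obstacle(ranges, angle_min=-math.pi, angle_increment=None,
                       arc_halfwidth=math.radians(30), max_valid=10.0):
      """Return (bearing, distance) of the closest valid return inside a forward arc.

      ranges          -- distances in meters, one per beam; 0 or inf means no return
      angle_min       -- bearing of the first beam, radians (assumed scan layout)
      angle_increment -- spacing between beams; defaults to an even 360-degree sweep
      arc_halfwidth   -- half-width of the forward arc to check, radians
      max_valid       -- ignore returns beyond this range
      """
      if angle_increment is None:
          angle_increment = 2 * math.pi / len(ranges)
      best = None
      for i, r in enumerate(ranges):
          if not (0.05 < r < max_valid):      # discard missing or absurd returns
              continue
          bearing = angle_min + i * angle_increment
          if abs(bearing) > arc_halfwidth:    # only look in the direction of travel
              continue
          if best is None or r < best[1]:
              best = (bearing, r)
      return best

  if __name__ == "__main__":
      # Fake 360-beam scan: open space except one rock about 5 degrees off straight ahead.
      scan = [float("inf")] * 360
      scan[185] = 1.2
      hit = nearest_obstacle(scan)
      if hit and hit[1] < 1.5:
          print(f"Obstacle {hit[1]:.2f} m away at {math.degrees(hit[0]):.0f} deg: stop or replan")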


Localization

For a robot to plan a path to its goal, it needs to know the goal's location and its own location.

  • Outdoor robots on Earth can use GPS to get their location within a few meters, or differential GPS to get within a few centimeters. GPS degrades in urban environments, and is nonexistent on other solar system bodies, underground, or indoors.
  • Localization beacons, like the HTC Vive Lighthouse, normally use optical or radio broadcasts to allow positioning (a minimal trilateration sketch follows this list).
  • Vision-based systems use naturally occurring landmarks or artificial computer vision markers like AprilTags or ArUco markers to allow the robot to localize itself. In most of these systems, pixel precision provides a constant angular position error, but depth precision degrades linearly with distance, and the target's angular orientation precision degrades quadratically. Cameras also have substantial latency, normally at least a few frames, as well as motion blur.
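
As a minimal sketch of beacon-based positioning, the snippet below solves for a robot's 2D position from measured ranges to beacons at known locations, using a standard linearized least-squares trilateration. The beacon layout and ranges are made-up numbers for illustration; a real beacon system's own drivers and calibration are not shown.

  import numpy as np

  def trilaterate_2d(beacons, ranges):
      """Estimate (x, y) from ranges to beacons at known 2D positions.

      beacons -- (N, 2) array of beacon coordinates in meters (N >= 3, not collinear)
      ranges  -- length-N array of measured distances to each beacon, in meters

      Linearizes the range equations against the first beacon and solves the
      resulting system by least squares.
      """
      beacons = np.asarray(beacons, dtype=float)
      ranges = np.asarray(ranges, dtype=float)
      x0, y0 = beacons[0]
      r0 = ranges[0]
      # Each remaining beacon i gives one linear equation:
      # 2(xi - x0) x + 2(yi - y0) y = r0^2 - ri^2 + xi^2 - x0^2 + yi^2 - y0^2
      A = 2.0 * (beacons[1:] - beacons[0])
      b = (r0**2 - ranges[1:]**2
           + beacons[1:, 0]**2 - x0**2
           + beacons[1:, 1]**2 - y0**2)
      position, *_ = np.linalg.lstsq(A, b, rcond=None)
      return position

  if __name__ == "__main__":
      # Three beacons at the corners of an assumed 10 m x 10 m arena.
      beacons = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
      true_pos = np.array([3.0, 4.0])
      ranges = [float(np.hypot(*(true_pos - b))) for b in beacons]
      print(trilaterate_2d(beacons, ranges))   # prints approximately [3. 4.]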


Path Planning

The closest analog in 3D printing is slicing; in CNC machining, it is toolpath planning.

For mobile robots, path planning is typically done online, in (near) real time. ROS provides a global_planner package that uses A* or Dijkstra's algorithm for global search, while the navigation stack adds a number of improvements, such as local planners and recovery behaviors. Raw A* is expensive, and can produce hunting (oscillating paths) if rerun continually in an online fashion.
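
As a minimal sketch of the global-search step, the function below runs A* over a small 2D occupancy grid with 4-connected moves and a Manhattan-distance heuristic. The grid encoding (0 = free, 1 = obstacle) is an assumption for illustration; it is not the costmap format used by the ROS global_planner package.

  import heapq

  def astar(grid, start, goal):
      """A* over a 2D occupancy grid (0 = free, 1 = obstacle) with 4-connected moves.

      Returns the path as a list of (row, col) cells, or None if the goal is unreachable.
      """
      rows, cols = len(grid), len(grid[0])
      heuristic = lambda c: abs(c[0] - goal[0]) + abs(c[1] - goal[1])  # Manhattan distance
      open_set = [(heuristic(start), 0, start)]   # entries are (f = g + h, g, cell)
      came_from = {}
      best_g = {start: 0}
      while open_set:
          f, g, cell = heapq.heappop(open_set)
          if cell == goal:
              path = [cell]
              while cell in came_from:            # walk parent links back to the start
                  cell = came_from[cell]
                  path.append(cell)
              return path[::-1]
          if g > best_g.get(cell, float("inf")):  # stale queue entry, already improved
              continue
          r, c = cell
          for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
              if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                  ng = g + 1
                  if ng < best_g.get((nr, nc), float("inf")):
                      best_g[(nr, nc)] = ng
                      came_from[(nr, nc)] = cell
                      heapq.heappush(open_set, (ng + heuristic((nr, nc)), ng, (nr, nc)))
      return None

  if __name__ == "__main__":
      grid = [[0, 0, 0, 0],
              [1, 1, 0, 1],
              [0, 0, 0, 0],
              [0, 1, 1, 0]]
      print(astar(grid, (0, 0), (3, 3)))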

Source code from the NASA Robotic Mining Competition teams is on GitHub.