
Meta Neural Band


Property "Subtype" (as page type) with input value "Electromyography]] Wristband" contains invalid characters or is incomplete and therefore can cause unexpected results during a query or annotation process. Property "Platform" (as page type) with input value "Meta Orion]] (prototype)" contains invalid characters or is incomplete and therefore can cause unexpected results during a query or annotation process. Property "Price" (as page type) with input value "$799 USD (bundled with Meta Ray-Ban Display glasses)" contains invalid characters or is incomplete and therefore can cause unexpected results during a query or annotation process. <ul><li>Property "Requires" (as page type) with input value "Meta Ray-Ban Display]] glasses" contains invalid characters or is incomplete and therefore can cause unexpected results during a query or annotation process.</li> <!--br--><li>Property "Requires" (as page type) with input value "Meta AI]] mobile app" contains invalid characters or is incomplete and therefore can cause unexpected results during a query or annotation process.</li></ul> Property "Precision" (as page type) with input value "High-precision finger tracking via [[sEMG" contains invalid characters or is incomplete and therefore can cause unexpected results during a query or annotation process. Property "Tracking" (as page type) with input value "Electromyography]] (EMG) muscle signal tracking" contains invalid characters or is incomplete and therefore can cause unexpected results during a query or annotation process. Property "Material" (as page type) with input value "Vectran]] (same material used in Mars Rover crash pads)" contains invalid characters or is incomplete and therefore can cause unexpected results during a query or annotation process. <ul><li>Property "Sensors" (as page type) with input value "sEMG]] sensors (16 channels)" contains invalid characters or is incomplete and therefore can cause unexpected results during a query or annotation process.</li> <!--br--><li>Property "Sensors" (as page type) with input value "IMU]] (accelerometer" contains invalid characters or is incomplete and therefore can cause unexpected results during a query or annotation process.</li></ul> Property "Compliance" (as page type) with input value "IPX7]] water resistance" contains invalid characters or is incomplete and therefore can cause unexpected results during a query or annotation process.

Meta Neural Band
Basic Info
VR/AR Virtual Reality, Augmented Reality, Mixed Reality
Type Input Device
Subtype Electromyography Wristband, Gesture Tracker, Neural Interface
Platform Meta Ray-Ban Display, Meta Orion (prototype)
Creator Thomas Reardon, Patrick Kaifosh
Developer Meta Platforms, Reality Labs, CTRL Labs
Manufacturer Meta Platforms
Announcement Date September 17, 2025
Release Date September 30, 2025
Price $799 USD (bundled with Meta Ray-Ban Display glasses)
Website https://www.meta.com/emerging-tech/emg-wearable-technology/
Versions Consumer version (2025)
Requires Meta Ray-Ban Display glasses, compatible smartphone, Meta AI mobile app
Predecessor CTRL Labs research prototypes
System
Operating System Works with iOS, Android
Storage
SD Card Slot No
Display
Display N/A
Precision High-precision finger tracking via sEMG
Image
Optics
Tracking
Tracking Electromyography (EMG) muscle signal tracking
Base Stations Not required
Eye Tracking No
Face Tracking No
Hand Tracking Yes (via muscle signals)
Body Tracking Wrist/hand only
Rotational Tracking Yes (wrist rotation)
Positional Tracking Limited (hand position inference)
Tracking Volume N/A
Play Space N/A
Latency Very low (can detect signals before visible movement)
Audio
Audio N/A
Microphone No
3.5mm Audio Jack No
Camera No
Connectivity
Connectivity Bluetooth
Ports Proprietary magnetic charging port
Wired Video N/A
Wireless Video N/A
WiFi No
Bluetooth Yes
Power Rechargeable battery
Battery Life Up to 18 hours
Device
Weight Lightweight (exact weight unspecified)
Material Vectran (same material used in Mars Rover crash pads)
Headstrap N/A
Haptics Yes (haptic feedback for gesture recognition)
Color Black, Sand
Sensors sEMG sensors (16 channels), IMU (accelerometer, gyroscope)
Input Gesture-based input via muscle signals
Compliance IPX7 water resistance
Size Three sizes available (Size 1, 2, 3)
Cable Length N/A


The Meta Neural Band is a wrist-worn electromyography (EMG) input device developed by Meta Platforms through its Reality Labs division. The device uses surface electromyography (sEMG) technology to detect electrical signals from muscle activity in the user's wrist and forearm, translating subtle finger movements and hand gestures into digital commands for controlling augmented reality (AR) and virtual reality (VR) devices.[1]

Announced at Meta Connect 2025 on September 17, 2025, and released on September 30, 2025, the Meta Neural Band represents the first consumer product to emerge from Meta's acquisition of CTRL Labs in 2019.[2] The device is exclusively bundled with Meta Ray-Ban Display glasses, priced at $799 USD for the combined package.[3]

History and Development

CTRL Labs Acquisition

The foundation for the Meta Neural Band traces back to September 2019, when Facebook (later renamed Meta) acquired CTRL Labs, a New York-based startup, for an estimated $500 million to $1 billion.[4] CTRL Labs had been developing wrist-based EMG technology since at least 2015, pioneering the use of muscle signal detection for computer input.[5]

Thomas Reardon, the co-founder and CEO of CTRL Labs and former architect of Internet Explorer, joined Meta as the director of Neuromotor Interfaces at Facebook Reality Labs following the acquisition.[6] Under his leadership, the project evolved from early research prototypes into a consumer-ready product.

Research and Development Timeline

Meta first publicly demonstrated its EMG wristband technology in March 2021, showcasing research prototypes capable of detecting the activity of individual motor neurons.[7] The company then spent roughly four years of development, drawing on data from nearly 200,000 research participants, to create machine learning models that could generalize across different users without requiring per-person calibration.[8]

In July 2025, Meta published peer-reviewed research in the scientific journal Nature, titled "A generic non-invasive neuromotor interface for human-computer interaction."[9] The paper detailed the technical achievements in creating a high-bandwidth neuromotor interface with out-of-the-box functionality, demonstrating handwriting input at 20.9 words per minute and gesture detection at 0.88 gestures per second.[10]

Product Evolution

The neural wristband was first demonstrated publicly as a prototype paired with Meta's Orion AR glasses at Meta Connect 2024 in September 2024.[11] This prototype, internally codenamed "Ceres," served as a proof of concept for the technology that would later ship with the Meta Ray-Ban Display glasses.[12]

Mark Zuckerberg indicated in early 2024 that the neural wristband would ship as a consumer product "in the next few years," with the actual release occurring in September 2025.[13]

Technology

Electromyography (sEMG)

The Meta Neural Band employs surface electromyography (sEMG) technology, which measures electrical activity produced by skeletal muscles.[14] Unlike traditional hand tracking methods that rely on camera-based computer vision, sEMG detects the electrical signals transmitted from the brain through the nervous system to control muscle movement in the fingers and hand.[15]

The wristband contains multiple sEMG sensors embedded around the circumference of the band, allowing it to capture muscle activation patterns as the user intends to move their fingers.[16] This approach offers several advantages over optical tracking:

  • Zero occlusion issues: Functions regardless of hand visibility or lighting conditions, including complete darkness[17]
  • Low latency: Can detect neural signals before fingers visibly move, potentially achieving negative latency[18]
  • Subtle input: Allows control with minimal movement, enabling interaction while hands rest at the user's side, in pockets, or on lap[19]
  • High precision: Provides accurate finger tracking without the need for exaggerated movements[20]
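
At a high level, an sEMG input pipeline band-pass filters the raw electrode signals, slices them into short windows, extracts per-channel features, and passes them to a learned classifier. The sketch below illustrates only that general flow; the sampling rate, filter band, feature choices, and toy linear classifier are assumptions for illustration, not Meta's implementation (which uses neural networks trained at scale, as described in the next section).

```python
# Illustrative sketch of a generic sEMG gesture pipeline; all parameters are assumptions.
import numpy as np
from scipy.signal import butter, sosfiltfilt

SAMPLE_RATE_HZ = 2000   # assumed sampling rate
NUM_CHANNELS = 16       # matches the 16-channel figure listed for the band

def bandpass(emg: np.ndarray, low: float = 20.0, high: float = 450.0) -> np.ndarray:
    """Band-pass filter raw sEMG to the typical muscle-activity band."""
    sos = butter(4, [low, high], btype="band", fs=SAMPLE_RATE_HZ, output="sos")
    return sosfiltfilt(sos, emg, axis=-1)

def features(window: np.ndarray) -> np.ndarray:
    """Simple per-channel features: root-mean-square and mean absolute value."""
    rms = np.sqrt(np.mean(window ** 2, axis=-1))
    mav = np.mean(np.abs(window), axis=-1)
    return np.concatenate([rms, mav])

def classify(feature_vec: np.ndarray, weights: np.ndarray, labels: list) -> str:
    """Toy linear classifier standing in for the real learned model."""
    return labels[int(np.argmax(weights @ feature_vec))]

# Usage: analyze one 200 ms window of synthetic 16-channel data.
emg = np.random.randn(NUM_CHANNELS, SAMPLE_RATE_HZ)          # 1 s of fake signal
window = bandpass(emg)[:, : int(0.2 * SAMPLE_RATE_HZ)]       # 200 ms analysis window
labels = ["rest", "index_pinch", "middle_pinch", "thumb_swipe"]
weights = np.random.randn(len(labels), 2 * NUM_CHANNELS)     # placeholder model weights
print(classify(features(window), weights, labels))
```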

Machine Learning and Generalization

A key breakthrough in the Meta Neural Band is its ability to work "out of the box" for new users without requiring individual calibration or training.[21] Meta's research team collected sEMG data from nearly 200,000 consenting research participants and used deep learning techniques, including neural networks, to identify common patterns in electrical signals across different individuals.[22]

The system's machine learning models process EMG signals locally on-device, ensuring privacy as the raw neural data never leaves the wristband.[23] The models can be further personalized through continued use, potentially improving performance by 16% with user-specific adaptation.[24]
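The reported 16% figure refers to optional user-specific adaptation layered on top of the generic model. A minimal sketch of that idea is shown below, assuming a frozen generic feature extractor whose small classification head is fine-tuned from a handful of labeled examples collected on-device; the shapes, names, and training loop are hypothetical and not Meta's training code.

```python
# Hypothetical per-user adaptation of a generic sEMG gesture classifier.
import numpy as np

def softmax(z: np.ndarray) -> np.ndarray:
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def finetune_head(feats, labels, num_classes, generic_head, lr=0.1, steps=200):
    """Adapt the classification head using a few user-specific examples.

    feats:        (N, D) features from the frozen generic encoder
    labels:       (N,) integer gesture labels
    generic_head: (D, num_classes) weights shipped with the device
    """
    W = generic_head.copy()                       # start from the generic model
    onehot = np.eye(num_classes)[labels]
    for _ in range(steps):
        probs = softmax(feats @ W)                # forward pass
        grad = feats.T @ (probs - onehot) / len(feats)
        W -= lr * grad                            # gradient step, entirely on-device
    return W

# Usage with synthetic data: 64 user examples, 32-dim features, 4 gestures.
rng = np.random.default_rng(0)
feats = rng.normal(size=(64, 32))
labels = rng.integers(0, 4, size=64)
user_head = finetune_head(feats, labels, 4, generic_head=rng.normal(size=(32, 4)))
```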

Design and Hardware

Physical Construction

The Meta Neural Band features a lightweight, flexible wristband design made from Vectran, a high-performance synthetic fiber also used in the landing airbags of NASA's Mars rovers.[25] The material provides exceptional tensile strength while remaining soft and flexible enough for comfortable all-day wear.[26]

The device is available in three sizes (Size 1, 2, and 3) to accommodate different wrist dimensions.[27] Proper fit is critical for optimal sEMG signal detection, requiring professional fitting at authorized retailers before purchase.[28] The wristband is offered in two colors: Black and Sand, matching the available color options for Meta Ray-Ban Display glasses.[29]

Specifications

Feature | Specification
Battery Life | Up to 18 hours[30]
Water Resistance | IPX7 rating[31]
Charging | Proprietary magnetic charger (included)[32]
Connectivity | Bluetooth[33]
Haptic Feedback | Yes (immediate gesture recognition feedback)[34]
Sensors | sEMG sensor array, IMU (accelerometer, gyroscope, orientation)[35]
Weight | Lightweight (exact weight not disclosed)[36]
Sizes Available | Three (Size 1, 2, and 3)[37]

Gestures and Input Methods

Current Gesture Set

The Meta Neural Band recognizes four primary gestures in its initial consumer release:[38]

Gesture | Function | Description
Thumb-to-Index Pinch | Click/Select | Primary selection gesture for activating buttons and links[39]
Thumb-to-Middle Pinch | Back/Menu | Single tap returns to previous screen; double tap toggles display on/off[40]
Thumb Swipe | Scroll | Swiping thumb against side of index finger acts as virtual D-pad for scrolling content[41]
Pinch-and-Twist | Volume/Zoom | Pinching with thumb and index finger while rotating wrist adjusts volume or camera zoom[42]

All gestures can be performed with the hand at rest, at the user's side, in a pocket, or on the lap, without requiring the hand to be raised or visible to cameras.[43]
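Conceptually, once the band has recognized a gesture, the glasses only need a small dispatch layer that maps gesture events to interface actions. The sketch below mirrors the launch gesture set in the table above; the enum, action names, and function signature are hypothetical and not part of any published Meta API.

```python
# Hypothetical mapping from recognized gestures to interface actions.
from enum import Enum, auto

class Gesture(Enum):
    INDEX_PINCH = auto()          # thumb-to-index pinch
    MIDDLE_PINCH = auto()         # thumb-to-middle pinch (single tap)
    MIDDLE_PINCH_DOUBLE = auto()  # thumb-to-middle pinch (double tap)
    THUMB_SWIPE = auto()          # thumb swipe along the index finger
    PINCH_TWIST = auto()          # pinch held while rotating the wrist

ACTIONS = {
    Gesture.INDEX_PINCH: "select",           # click / activate
    Gesture.MIDDLE_PINCH: "back",            # return to previous screen
    Gesture.MIDDLE_PINCH_DOUBLE: "toggle_display",
    Gesture.THUMB_SWIPE: "scroll",           # virtual D-pad
    Gesture.PINCH_TWIST: "adjust_level",     # volume or camera zoom
}

def handle(gesture: Gesture, swipe_delta: float = 0.0, twist_deg: float = 0.0) -> str:
    """Translate a gesture event into an action string for the UI layer."""
    action = ACTIONS[gesture]
    if gesture is Gesture.THUMB_SWIPE:
        return f"{action}:{swipe_delta:+.2f}"
    if gesture is Gesture.PINCH_TWIST:
        return f"{action}:{twist_deg:+.1f}deg"
    return action

print(handle(Gesture.THUMB_SWIPE, swipe_delta=0.35))  # "scroll:+0.35"
```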

Future Input Methods

Meta announced plans to release a software update in December 2025 that will enable handwriting recognition by allowing users to trace letters on a physical surface (such as their leg) with their index finger.[44] Research demonstrations have shown the system capable of recognizing handwritten input at approximately 20.9 words per minute, with potential for improvement through personalization.[45]

Future capabilities may include virtual keyboard emulation, with Meta having demonstrated early prototypes and suggesting that typing speeds approaching traditional keyboard input could arrive by around 2028.[46]

Compatibility and Requirements

Primary Platform

The Meta Neural Band is designed to work exclusively with Meta Ray-Ban Display glasses in its first consumer release.[47] The device requires:[48]

  • Meta Ray-Ban Display glasses
  • A compatible smartphone running iOS or Android
  • The Meta AI mobile app

Future Platforms

Meta has confirmed the Neural Band will serve as the primary input method for future AR products, including:

  • Orion AR Glasses: Meta's prototype true AR glasses demonstrated at Connect 2024, featuring a 70-degree field of view[49]
  • Future AR Glasses: Consumer versions of true AR glasses planned for release around 2027[50]
  • Potential Standalone Use: Mark Zuckerberg has suggested the wristband could evolve into its own platform, potentially controlling devices beyond Meta's ecosystem[51]

Applications and Use Cases

Smart Glasses Control

The primary function of the Meta Neural Band is controlling Meta Ray-Ban Display glasses, which feature a monocular heads-up display (HUD) with 600×600 resolution and approximately 20-degree field of view.[52] Users can navigate the glasses' interface to:

  • Check notifications
  • View Meta AI responses with visual guidance
  • Access navigation with turn-by-turn directions
  • Control camera and preview photos
  • View real-time translations
  • Manage media playback[53]
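
As a rough sanity check on the display figures quoted above, a 600-pixel-wide panel spread across an approximately 20-degree field of view implies an angular resolution of about 30 pixels per degree:

```python
# Back-of-the-envelope angular resolution from the stated display figures.
horizontal_pixels = 600        # stated panel resolution (600x600)
field_of_view_deg = 20         # approximate field of view
pixels_per_degree = horizontal_pixels / field_of_view_deg
print(pixels_per_degree)       # 30.0 pixels per degree (approximate)
```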

Accessibility

The sEMG-based input method offers significant accessibility benefits for individuals with limited mobility.[54] Because the technology can detect intended movements even when users cannot produce large physical gestures, it is potentially useful for individuals with motor impairments or reduced hand function.[55]

The ability to control devices with minimal movement while hands rest comfortably addresses challenges faced by users who experience pain or fatigue from traditional input methods.[56]

Reception

Hands-On Reviews

Early hands-on reviews from Meta Connect 2025 praised the Meta Neural Band's performance, with several reviewers describing it as working "like magic."[57] UploadVR reported a 100% gesture recognition success rate during their testing session, with immediate haptic feedback confirming each detected gesture.[58] Road to VR noted that the wristband allows "subtle inputs while your hand is down at your side" and called it "just as important to these new glasses as the display itself."[59]

Tom's Guide described the Meta Ray-Ban Display glasses as "the best smart glasses" they had tested, highlighting the "intuitive wrist gestures" as a key feature.[60]

Technical Achievement

The Nature publication of Meta's research was recognized as a significant milestone in brain-computer interface (BCI) and human-computer interaction (HCI) fields.[61] Dario Farina, a professor of bioengineering at Imperial College London, stated: "This idea – this kind of technology – is not new, it is decades old. The breakthrough here is that Meta has used artificial intelligence to analyse very large amounts of data, from thousands of individuals, and make this technology robust. It now performs at a level it has never reached before."[62]

Challenges

Some reviewers noted that while the Neural Band itself performed flawlessly, the initial Meta Ray-Ban Display glasses had limitations, including:

  • Monocular display causing disorientation for some users
  • Limited field of view (20 degrees)
  • Performance issues with the Qualcomm Snapdragon AR1 Gen 1 chipset
  • Weight increase compared to non-display Ray-Ban Meta glasses[63]

Availability and Pricing

Initial Release

The Meta Neural Band launched on September 30, 2025, exclusively bundled with Meta Ray-Ban Display glasses at a combined price of $799 USD.[64] The device cannot be purchased separately and is initially available only through select authorized retailers in the United States.[65]

Purchase requires scheduling an in-person demonstration and professional fitting to ensure proper wristband size selection.[66]

International Expansion

Meta announced plans to expand availability to Canada, France, Italy, and the United Kingdom in early 2026.[67] The company stated it would eventually sell the product online after gathering feedback from the initial retail-only launch.[68]

Competing Technologies

Mudra Link

At CES 2025, a competing device called Mudra Link was demonstrated, offering similar EMG-based gesture control for $199.[69] The Mudra Link claims cross-platform compatibility with various AR and XR devices, including TCL RayNeo X3 Pro and Apple Vision Pro, and includes pressure detection features not present in the initial Meta Neural Band release.[70]

Traditional Input Methods

The Meta Neural Band competes with several established input methods for AR/VR:

  • Optical hand tracking: camera-based tracking that requires hands to be raised, visible, and adequately lit[71]
  • Motion controllers: handheld devices offering precise input but requiring the user to hold dedicated hardware[72]
  • Voice commands: hands-free input that can be impractical in noisy or public environments[73]
  • Eye tracking: gaze-based pointing, typically combined with a separate confirmation input[74]

Scientific Contributions

Published Research

Meta released extensive documentation of its EMG research to the scientific community:[75]

  • Nature Paper: "A generic non-invasive neuromotor interface for human-computer interaction" (July 2025)[76]
  • Public Dataset: Over 100 hours of sEMG recordings from 100 participants per task type (discrete gestures, handwriting, wrist movements)[77]
  • Open Source Code: GitHub repository containing model implementations, training code, and evaluation tools[78]

The research demonstrated three key tasks, with rough unit conversions sketched after the list:

  • Discrete Gesture Recognition: 0.88 gesture detections per second[79]
  • Continuous Navigation: 0.66 target acquisitions per second[80]
  • Handwriting Recognition: 20.9 words per minute (median performance)[81]
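
Using the common convention of five characters per "word" (an assumption here, since the paper's exact counting convention is not reproduced in this article), these figures convert to roughly 1.7 characters per second for handwriting and about 53 discrete gestures per minute:

```python
# Rough unit conversions of the reported throughput figures.
words_per_minute = 20.9
chars_per_second = words_per_minute * 5 / 60      # assumes 5 characters per word
gestures_per_minute = 0.88 * 60
print(round(chars_per_second, 2))                 # ~1.74 characters per second
print(round(gestures_per_minute, 1))              # ~52.8 gestures per minute
```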

Contributors

The Nature paper listed contributions from Meta's Reality Labs team, with lead authors Patrick Kaifosh (Director of Research Science) and Thomas Reardon (Vice President of Neuromotor Interfaces).[82] The three-page contributor list indicated the extensive collaborative effort behind the project's development.[83]

Impact on Computing

Next-Generation Input Paradigm

Meta positions the Neural Band as representing a fundamental shift in human-computer interaction, replacing touchscreens, buttons, and traditional input devices with direct neural signal interpretation.[84] Mark Zuckerberg described it as potentially becoming "a pretty big deal" and suggested it could evolve beyond AR glasses to become a universal input method for controlling all digital devices, including smart home systems, gaming consoles, and computers.[85]

Privacy and Safety

Meta emphasizes that the Neural Band only reads outgoing motor signals from the brain to muscles, and cannot "read minds" or access thoughts.[86] Thomas Reardon clarified: "This is coming from the part of the brain that controls motor information, not thought. It sees what you are about to do."[87] All signal processing occurs locally on the device, with no raw EMG data transmitted to Meta's servers.[88]
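
The privacy model described above amounts to an architectural boundary: raw sEMG is processed on the wristband, and only compact, discrete gesture events cross the Bluetooth link. The sketch below is a hypothetical illustration of that boundary; the event fields, names, and wire format are assumptions, not Meta's actual protocol.

```python
# Hypothetical illustration of the on-device privacy boundary: raw EMG stays on
# the band, and only a small gesture event is transmitted to the paired glasses.
from dataclasses import dataclass
import time

@dataclass
class GestureEvent:
    gesture: str        # e.g. "index_pinch"
    timestamp_ms: int   # detection time on the band
    confidence: float   # classifier confidence, 0..1

def process_locally(raw_emg) -> GestureEvent:
    """Runs on the wristband; raw_emg never leaves this function."""
    # ... filtering, feature extraction, and classification would happen here ...
    return GestureEvent("index_pinch", int(time.time() * 1000), 0.97)

def transmit(event: GestureEvent) -> bytes:
    """Only the compact event record, a few dozen bytes, is sent over Bluetooth."""
    return f"{event.gesture},{event.timestamp_ms},{event.confidence:.2f}".encode()

print(transmit(process_locally(raw_emg=None)))
```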

See Also

References

  1. meta_official_emg
  2. meta_connect_2025
  3. uploadvr_handson
  4. ctrl_labs_acquisition
  5. techspot_emg
  6. reardon_meta
  7. meta_2021_demo
  8. nature_paper
  9. nature_publication
  10. nature_metrics
  11. orion_demo
  12. ceres_codename
  13. zuckerberg_timeline
  14. emg_technology
  15. semg_explanation
  16. sensor_array
  17. darkness_functionality
  18. negative_latency
  19. subtle_gesture
  20. precision_tracking
  21. no_calibration
  22. ml_training
  23. local_processing
  24. personalization_improvement
  25. vectran_material
  26. material_properties
  27. three_sizes
  28. fitting_requirement
  29. color_options
  30. battery_life
  31. ipx7_rating
  32. charging_method
  33. bluetooth_connectivity
  34. haptic_feedback
  35. sensor_types
  36. lightweight_design
  37. size_availability
  38. four_gestures
  39. pinch_click
  40. middle_pinch
  41. thumb_swipe
  42. pinch_twist
  43. hands_down_interaction
  44. handwriting_update
  45. handwriting_speed
  46. keyboard_future
  47. rayban_exclusive
  48. system_requirements
  49. orion_compatibility
  50. future_ar_glasses
  51. standalone_potential
  52. display_specs
  53. glasses_features
  54. accessibility_benefits
  55. accessibility_applications
  56. reduced_movement_benefit
  57. magic_reviews
  58. uploadvr_success_rate
  59. roadtovr_review
  60. tomsguide_review
  61. nature_significance
  62. farina_quote
  63. glasses_limitations
  64. launch_price
  65. retail_locations
  66. demo_requirement
  67. international_expansion
  68. online_future
  69. mudra_link
  70. mudra_features
  71. optical_comparison
  72. controller_comparison
  73. voice_comparison
  74. eye_tracking_comparison
  75. scientific_release
  76. nature_paper_full
  77. dataset_release
  78. github_repo
  79. gesture_metric
  80. navigation_metric
  81. handwriting_metric
  82. paper_authors
  83. contributor_list
  84. paradigm_shift
  85. zuckerberg_vision
  86. not_mind_reading
  87. reardon_clarification
  88. local_privacy
