Abstract:

Fingerprinting physical items to mint NFTs is described. One or more features of a physical item are captured using a fingerprint capture system of a client device, and a fingerprint of the physical item is generated using the captured features of the physical item. The fingerprint of the physical item is provided to an authentication service to verify that the physical item corresponds to an authentic physical item by matching the fingerprint of the physical item to distinguishing features of the authentic physical item. Responsive to verification by the authentication service, a digital twin NFT is minted on a blockchain using the matched fingerprint. A combined listing for the physical item and the digital twin NFT is then generated on a listing platform.
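
The capture-verify-mint flow described above might be sketched, with hypothetical names and an in-memory stand-in for the authentication service and blockchain, as:

```python
import hashlib

# Hypothetical registry of authentic items: item id -> distinguishing features.
AUTHENTIC_ITEMS = {
    "watch-001": {"serial_etching", "bezel_microtexture", "caseback_engraving"},
}

def generate_fingerprint(features):
    """Derive a deterministic fingerprint from the captured physical features."""
    return hashlib.sha256("|".join(sorted(features)).encode()).hexdigest()

def verify(features, item_id):
    """Authentication service: match captured features to the authentic item's."""
    return features == AUTHENTIC_ITEMS.get(item_id)

def mint_digital_twin(fingerprint, chain):
    """Mint a digital twin NFT by recording the matched fingerprint on-chain."""
    token_id = len(chain)
    chain.append({"token_id": token_id, "fingerprint": fingerprint})
    return token_id

captured = {"serial_etching", "bezel_microtexture", "caseback_engraving"}
chain = []
if verify(captured, "watch-001"):
    token = mint_digital_twin(generate_fingerprint(captured), chain)
    # Combined listing pairs the physical item with its digital twin NFT.
    listing = {"item": "watch-001", "nft_token_id": token}
```

Sorting the feature set before hashing keeps the fingerprint stable regardless of capture order; a real system would hash richer sensor data and mint via a blockchain client.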

Country: United States
Grant Date: September 12, 2023
INVENTORS: Andrew Chalkley, Chris Matthews, Paul Stathacopoulos, Shannon Vosseller

Abstract:

Techniques are described, as implemented by computing devices, to control access to transactions through use of tokenized reputation scores. This is performed by leveraging a blockchain such that a tokenized reputation score is generated or calculated based on an amount of reputation tokens associated with a blockchain account address associated with a service provider account, and by making transactional functionality available to the service provider account based on a comparison of a tokenized reputation score affiliated with the service provider account with a threshold score associated with a transaction.
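
The score-and-threshold gating described above reduces to a small sketch; the balances and addresses below are hypothetical placeholders for on-chain token lookups:

```python
# Hypothetical reputation-token balances per blockchain account address.
TOKEN_BALANCES = {"0xSELLER": 120, "0xNEWBIE": 5}

def tokenized_reputation_score(address):
    """Derive a reputation score from the reputation tokens held at an address."""
    return TOKEN_BALANCES.get(address, 0)

def access_allowed(address, threshold):
    """Gate transactional functionality: score must meet the transaction's threshold."""
    return tokenized_reputation_score(address) >= threshold
```

A transaction requiring a threshold of 100 would be available to `0xSELLER` but not `0xNEWBIE`.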

Country: United States
Grant Date: September 5, 2023
INVENTORS: Andrew Chalkley, Shannon Vosseller

Abstract:

An augmented reality or virtual reality (AR/VR) device pairs with a companion device to augment input interfaces associated with an AR/VR application at the AR/VR device. In implementations, an AR/VR device determines a portion of a markup file that corresponds to an AR/VR scene of a plurality of AR/VR scenes in an AR/VR environment, and communicates the portion of the markup file to the companion device to cause the companion device to configure a companion user interface associated with initiating an action as part of the AR/VR scene. In response to receiving user input via the companion user interface, the companion device communicates the action to the AR/VR device to initiate the action. The AR/VR device receives input data from the companion device, and initiates the action for the AR/VR scene.
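
The round trip described above (scene-specific markup portion out, action back in) might be modeled as follows; the markup contents and scene names are illustrative assumptions:

```python
# Hypothetical markup file: each AR/VR scene maps to a UI fragment and an action.
MARKUP = {
    "storefront": {"ui": "<button label='Buy'/>", "action": "purchase"},
    "gallery": {"ui": "<button label='Next'/>", "action": "advance"},
}

def portion_for_scene(scene):
    """AR/VR device: determine the markup portion for the current scene."""
    return MARKUP[scene]

def companion_configure(portion):
    """Companion device: configure its user interface from the received portion."""
    return {"rendered": portion["ui"], "on_tap": portion["action"]}

def on_user_input(companion_ui):
    """Companion device: communicate the action back to the AR/VR device."""
    return companion_ui["on_tap"]

ui = companion_configure(portion_for_scene("storefront"))
action = on_user_input(ui)  # the AR/VR device then initiates this action
```

Sending only the scene's portion of the markup file keeps the companion device stateless with respect to the rest of the AR/VR environment.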

Country: United States
Grant Date: May 16, 2023
INVENTORS: Andrew Chalkley, Joshua Timonen

Abstract:

Techniques are described for automatically modulating a physical configuration of a reconfigurable building structure. A reconfigurable building structure may be constructed of physical elements that are movable with respect to one another to facilitate actuating the reconfigurable building structure between a plurality of different physical configurations. The physical configuration of a reconfigurable building structure may be adjusted to accommodate the physical dimensions of an item that is going to be moved into the reconfigurable building structure. For example, a spacing between two shelves may be expanded in response to an order being placed for a large item. In this way, when the item is delivered to a physical address associated with the reconfigurable building structure, various physical characteristics of the reconfigurable building structure may have already been modulated to accept the item.
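
The shelf-spacing example above can be sketched as a simple pre-delivery adjustment; the clearance margin and gap units are hypothetical:

```python
def required_spacing(item_height_cm, clearance_cm=5):
    """Spacing needed to accept an item, with an assumed clearance margin."""
    return item_height_cm + clearance_cm

def reconfigure_shelves(shelf_gaps, slot, item_height_cm):
    """Expand one shelf gap, before delivery, if the ordered item won't fit."""
    needed = required_spacing(item_height_cm)
    if shelf_gaps[slot] < needed:
        shelf_gaps[slot] = needed  # actuate the movable elements to the new gap
    return shelf_gaps
```

An order for a 40 cm item against a 30 cm gap would widen that gap to 45 cm while leaving the other shelves untouched.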

Country: United States
Grant Date: February 7, 2023
INVENTORS: Dileep Kumar Basam, Andrew Chalkley, Ethan Rubinson, Jean-David Ruvini, Bindia Saraf, Qiaosong Wang

Abstract:

A system described herein uses data obtained from a wearable device of a first user to identify a second user and/or to determine that the first user is within a threshold distance of the second user. The system can then access an account of the second user to identify one or more items and retrieve model data for the item(s). The system causes the wearable device of the first user to render, for display in an immersive 3D environment (e.g., an augmented reality environment), an item associated with the account of the second user. The item can be rendered for display at a location on a display that is proximate to the second user (e.g., within a threshold distance of the second user) such that the item graphically corresponds to the second user. The item rendered for display may be an item of interest to the first user.
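
The proximity check and render step described above might look like the following; the account store, positions, and threshold are illustrative assumptions:

```python
import math

def within_threshold(pos_a, pos_b, threshold_m=3.0):
    """Check whether two users are within a threshold distance of each other."""
    return math.dist(pos_a, pos_b) <= threshold_m

# Hypothetical account store: user -> items with associated 3D model data.
ACCOUNTS = {"user_b": [{"item": "sneakers", "model": "sneakers.glb"}]}

def render_plan(first_pos, second_user, second_pos):
    """If close enough, plan rendering the second user's items anchored near them."""
    if not within_threshold(first_pos, second_pos):
        return []
    return [{"model": i["model"], "anchor": second_pos}
            for i in ACCOUNTS.get(second_user, [])]
```

Anchoring each rendered item at the second user's position is what makes the item graphically correspond to that user in the immersive 3D view.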

Country: United States
Grant Date: January 17, 2023
INVENTORS: David Beach, Andrew Chalkley, Joshua Timonen, Steve Yankovich

Abstract:

An augmented reality or virtual reality (AR/VR) device pairs with a companion device to augment input interfaces associated with an AR/VR application at the AR/VR device. In implementations, an AR/VR device determines a portion of a markup file that corresponds to an AR/VR scene of a plurality of AR/VR scenes in an AR/VR environment, and communicates the portion of the markup file to the companion device to cause the companion device to configure a companion user interface associated with initiating an action as part of the AR/VR scene. In response to receiving user input via the companion user interface, the companion device communicates the action to the AR/VR device to initiate the action. The AR/VR device receives input data from the companion device, and initiates the action for the AR/VR scene.

Country: United States
Grant Date: April 5, 2022
INVENTORS: Andrew Chalkley, Joshua Timonen

Abstract:

Model placement metadata is defined and stored for a three-dimensional ("3D") model. The model placement metadata specifies constraints on the presentation of the 3D model when rendered in a view of a real-world environment, such as a view of a real-world environment generated by a wearable computing device like an augmented reality ("AR") or virtual reality ("VR") device. A wearable computing device can analyze the geometry of a real-world environment to determine a configuration for the 3D model that satisfies the constraints set forth by the model placement metadata when the 3D model is rendered in a view of the environment. Once the configuration for the 3D model has been computed, the wearable device can render the 3D model according to the computed configuration and display the rendering in a view of the real-world environment.
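
The constraint-satisfaction step described above might be sketched as a scan over detected surfaces; the metadata fields and surface records are hypothetical:

```python
# Hypothetical placement metadata: constraints a rendered model must satisfy.
METADATA = {"sofa": {"surface": "floor", "min_clear_width_m": 2.0}}

def find_configuration(model, surfaces):
    """Scan analyzed real-world geometry for a placement satisfying the metadata."""
    meta = METADATA[model]
    for s in surfaces:
        if s["type"] == meta["surface"] and s["width_m"] >= meta["min_clear_width_m"]:
            return {"model": model, "position": s["origin"]}
    return None  # no configuration in this environment satisfies the constraints
```

Returning `None` when no surface qualifies lets the device decline to render rather than place the model somewhere the metadata forbids.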

Country: United States
Grant Date: December 14, 2021
INVENTORS: Andrew Chalkley, Steve Yankovich

Abstract:

Techniques are described for automatically modulating a physical configuration of a reconfigurable building structure. A reconfigurable building structure may be constructed of physical elements that are movable with respect to one another to facilitate actuating the reconfigurable building structure between a plurality of different physical configurations. The physical configuration of a reconfigurable building structure may be adjusted to accommodate the physical dimensions of an item that is going to be moved into the reconfigurable building structure. For example, a spacing between two shelves may be expanded in response to an order being placed for a large item. In this way, when the item is delivered to a physical address associated with the reconfigurable building structure, various physical characteristics of the reconfigurable building structure may have already been modulated to accept the item.

Country: United States
Grant Date: November 9, 2021
INVENTORS: Dileep Kumar Basam, Andrew Chalkley, Ethan Rubinson, Jean-David Ruvini, Bindia Saraf, Qiaosong Wang

Abstract:

An augmented reality or virtual reality (AR/VR) device pairs with a companion device to augment input interfaces associated with an AR/VR application at the AR/VR device. In implementations, an AR/VR device determines a portion of a markup file that corresponds to an AR/VR scene of a plurality of AR/VR scenes in an AR/VR environment, and communicates the portion of the markup file to the companion device to cause the companion device to configure a companion user interface associated with initiating an action as part of the AR/VR scene. In response to receiving user input via the companion user interface, the companion device communicates the action to the AR/VR device to initiate the action. The AR/VR device receives input data from the companion device, and initiates the action for the AR/VR scene.

Country: United States
Grant Date: October 19, 2021
INVENTORS: Andrew Chalkley, Joshua Timonen

Abstract:

The disclosed technologies include a robotic selling assistant that receives an item from a seller, automatically generates a posting describing the item for sale, stores the item until it is sold, and delivers or sends the item out for delivery. The item is placed in a compartment that uses one or more sensors to identify the item, retrieve supplemental information about the item, and take pictures of the item for inclusion in the posting. A seller-supplied description of the item may be verified based on the retrieved supplemental information, preventing mislabeled items from being sold.
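
The identify-verify-post pipeline described above can be sketched as follows; the catalog, the barcode-style identifier, and the substring check stand in for real sensor and lookup machinery:

```python
# Hypothetical supplemental catalog keyed by a sensed identifier (e.g. a barcode).
CATALOG = {"0123456789": {"title": "Acme Blender", "brand": "Acme"}}

def identify(sensor_reading):
    """Compartment sensors resolve the item to supplemental catalog data."""
    return CATALOG.get(sensor_reading)

def verify_description(seller_description, supplemental):
    """Flag mislabeled items by checking the seller's text against catalog data."""
    return (supplemental is not None
            and supplemental["title"].lower() in seller_description.lower())

def generate_posting(sensor_reading, seller_description, photos):
    """Automatically build a for-sale posting, rejecting mismatched descriptions."""
    info = identify(sensor_reading)
    if not verify_description(seller_description, info):
        return None  # mislabeled or unidentified item: do not list
    return {"title": info["title"], "description": seller_description, "photos": photos}
```

Verification before posting is what prevents a mislabeled item from ever reaching buyers.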

Country: United States
Grant Date: June 1, 2021
INVENTORS: Andrew Chalkley, Robinson Piramuthu, Bindia Saraf, Qiaosong Wang

Abstract:

Model placement metadata is defined and stored for a three-dimensional ("3D") model. The model placement metadata specifies constraints on the presentation of the 3D model when rendered in a view of a real-world environment, such as a view of a real-world environment generated by a wearable computing device like an augmented reality ("AR") or virtual reality ("VR") device. A wearable computing device can analyze the geometry of a real-world environment to determine a configuration for the 3D model that satisfies the constraints set forth by the model placement metadata when the 3D model is rendered in a view of the environment. Once the configuration for the 3D model has been computed, the wearable device can render the 3D model according to the computed configuration and display the rendering in a view of the real-world environment.

Country: United States
Grant Date: January 12, 2021
INVENTORS: Andrew Chalkley, Steve Yankovich

Abstract:

An augmented reality or virtual reality (AR/VR) device pairs with a companion device to augment input interfaces associated with an AR/VR application at the AR/VR device. In implementations, an AR/VR device determines a portion of a markup file that corresponds to an AR/VR scene of a plurality of AR/VR scenes in an AR/VR environment, and communicates the portion of the markup file to the companion device to cause the companion device to configure a companion user interface associated with initiating an action as part of the AR/VR scene. In response to receiving user input via the companion user interface, the companion device communicates the action to the AR/VR device to initiate the action. The AR/VR device receives input data from the companion device, and initiates the action for the AR/VR scene.

Country: United States
Grant Date: January 12, 2021
INVENTORS: Andrew Chalkley, Joshua Timonen
