Hand tracking in Extended Reality (XR) enables moving objects in near space with direct hand gestures: picking, dragging, and dropping objects in 3D. We explore the use of eye tracking to reduce the effort involved in this interaction. Because the eyes naturally look ahead to the target of a drag operation, the principal idea is to map the object's translation in the image plane to gaze, so that the hand only needs to control the depth component of the operation. We demonstrate several applications we built for 3D manipulation, including area selection, 3D path specification, and a "Bejeweled"-inspired game, showing the potential for effortless drag-and-drop actions in 3D space. This demonstration includes the study apparatus and the applications from a paper of the same title to be presented at UIST'24 (Wagner 2024, Proc. UIST).
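
The gaze-plus-depth mapping described above can be sketched in a few lines: gaze fixes the dragged object's direction in the image plane, while the hand supplies only a scalar depth along that ray. This is a minimal illustration under assumed conventions (the function name, a head-fixed eye origin, and a normalized gaze vector are ours, not the paper's implementation):

```python
import numpy as np

def gaze_plus_depth_position(eye_origin, gaze_dir, hand_depth):
    """Place the dragged object along the current gaze ray.

    Gaze determines the object's image-plane position (its direction
    from the eye); the hand controls only the depth along that ray.
    """
    d = np.asarray(gaze_dir, dtype=float)
    d = d / np.linalg.norm(d)  # normalize the gaze direction
    return np.asarray(eye_origin, dtype=float) + hand_depth * d

# Example: eye at 1.6 m height, looking straight ahead (+z),
# hand sets the depth to 2 m.
pos = gaze_plus_depth_position([0.0, 1.6, 0.0], [0.0, 0.0, 1.0], 2.0)
# pos == [0.0, 1.6, 2.0]
```

In this decomposition the two input channels are independent: moving the eyes re-aims the ray without disturbing depth, and moving the hand along its depth axis slides the object along the ray without disturbing its on-screen position.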