From: marc nicole
Newsgroups: comp.lang.python
Subject: How to go about a simple object grabbing in python (given coordinates of arms and objects)
Date: Sat, 22 Jun 2024 14:41:47 +0200

Hello to all in this magnificent community!

I have a problem that I have already spent a few days on and still can't
figure out a proper solution for. Given the x, y, z coordinates of a target
object and the x, y, z offsets of a robot's arms, what is a good algorithm
for grabbing the object between the hands (either from both sides or from
below, in both cases using both hands)?

Specifically, my problem is set in a NAO robot environment, where I retrieve
the target object's coordinates with the following code:

    # position of the tracked target, expressed in the robot's torso frame
    tracker_service = session.service("ALTracker")
    xyz_pos = tracker_service.getTargetPosition(motion.FRAME_TORSO)

src: http://doc.aldebaran.com/2-8/naoqi/motion/control-cartesian.html#motion-cartesian-effectors

Then I move the right arm close to the object with the following code:

    effector = "RArm"
    frame = motion.FRAME_TORSO

    # current transform of the right arm effector in the torso frame
    effector_offset = almath.Transform(
        self.motion.getTransform(effector, frame, False))
    effector_init_3d_position = almath.position3DFromTransform(effector_offset)

    # vector from the hand's current position to the target
    target_3d_position = almath.Position3D(target_position)
    move_3d = target_3d_position - effector_init_3d_position
    moveTransform = almath.Transform.fromPosition(move_3d.x, move_3d.y, move_3d.z)
    target_transformer_list = list(moveTransform.toVector())

    times = [2.0]
    axis_mask_list = motion.AXIS_MASK_VEL
    self.motion.transformInterpolations(effector, frame, target_transformer_list,
                                        axis_mask_list, times)

src: http://doc.aldebaran.com/1-14/dev/python/examples/almath/index.html?highlight=offset

This question is specific to the NAO environment, but how does one go about
this task in general? What is the most common algorithm used in this case?
Do I also have to determine the side of the object in order to know exactly
where the arms should be placed?
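For what it's worth, below is the kind of symmetric two-hand approach I have
been imagining: compute a pre-grasp pose for each hand on either side of the
object, move both arms there, then close the hands. It is only a sketch and
not tested on the robot. OBJECT_WIDTH, CLEARANCE and side_grasp are names and
values I made up (which already assumes I know the object's width, hence my
last question), and while positionInterpolations, setStiffnesses, openHand
and closeHand are documented ALMotion calls, I am not sure I am using them
correctly here, nor whether the wrist orientation can really be left
unconstrained like this.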
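    import motion  # constants from the naoqi Python SDK (FRAME_TORSO, AXIS_MASK_VEL)

    OBJECT_WIDTH = 0.10  # hypothetical: object width in metres, measured beforehand
    CLEARANCE = 0.02     # extra gap for the approach phase

    def side_grasp(motion_service, target_position):
        """Bring both hands to the sides of the object, then close them.

        motion_service  -- the ALMotion service (session.service("ALMotion"))
        target_position -- [x, y, z] from ALTracker, in FRAME_TORSO
        """
        x, y, z = target_position
        half = OBJECT_WIDTH / 2.0

        # One path per hand: an approach point offset to the object's left or
        # right along the torso Y axis, then a contact point on its side.
        # Each waypoint is a Position6D: (x, y, z, wx, wy, wz).
        left_path = [[x, y + half + CLEARANCE, z, 0.0, 0.0, 0.0],   # approach
                     [x, y + half, z, 0.0, 0.0, 0.0]]               # contact
        right_path = [[x, y - half - CLEARANCE, z, 0.0, 0.0, 0.0],
                      [x, y - half, z, 0.0, 0.0, 0.0]]

        motion_service.setStiffnesses(["LArm", "RArm"], 1.0)
        motion_service.openHand("LHand")
        motion_service.openHand("RHand")

        # Move both arms together; AXIS_MASK_VEL constrains position only,
        # leaving the wrist orientation free.
        motion_service.positionInterpolations(
            ["LArm", "RArm"], motion.FRAME_TORSO,
            [left_path, right_path],
            [motion.AXIS_MASK_VEL, motion.AXIS_MASK_VEL],
            [[2.0, 3.0], [2.0, 3.0]])  # seconds for approach and contact

        motion_service.closeHand("LHand")
        motion_service.closeHand("RHand")

The idea would be to call side_grasp(self.motion, xyz_pos) right after the
tracking snippet above. Whether squeezing from both sides like this is
enough, or whether I need a proper grasp planner that also reasons about the
object's shape, is exactly what I am unsure about.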