How to interact with 3D deformable objects in SOFA
This topic has 8 replies, 2 voices, and was last updated 6 years, 11 months ago by Hugo.
21 December 2017 at 20:51 #10245 - Sarath
I am using SOFA to simulate a virtual surgical scenario; however, I am not sure how to do this. I am planning on using either a mouse or a haptic device to control a surgical knife or tool to interact with deformable tissue. So far, my understanding is that we use something called a MechanicalStateController to control any object inside the simulation.
Example: this is taken from haption1.scn of the Haption plugin:
<MechanicalStateController template="Rigid" name="mechanicalStateController1" buttonDeviceState="@haptionDriver1.state_button" mainDirection="0 0 -1" />
However, I could not find any example of how to do the same with a mouse. My goal ultimately is to use a haptic device, but over VRPN (Virtual-Reality Peripheral Network). If anyone has done this before or has any idea about this, please do share the information.
Thank you
3 January 2018 at 18:07 #10251 - Sarath
Update: I still haven't found any useful info anywhere. It would be of great help if anyone could post some relevant info on this.
Thank you.
8 January 2018 at 22:47 #10281 - Hugo
Dear @sarath,
Sorry for not having been able to assist you earlier.
I would like to wish you and all SOFA users a happy new year 2018! From what I understood, you aim at interacting with a deformable object, right?
One solution is to use the mouse to move one object that collides with its environment. The scene below implements such a case:

<Node name="root" dt="0.005" gravity="0 0 0">
    <VisualStyle displayFlags="showCollisionModels hideVisualModels" />
    <DefaultAnimationLoop />
    <CollisionPipeline verbose="0" draw="0" />
    <BruteForceDetection name="N2" />
    <NewProximityIntersection name="Proximity" alarmDistance="1" contactDistance="0.4" />
    <CollisionResponse name="Response" response="default" />
    <MeshGmshLoader name="meshLoader" filename="mesh/liver.msh" />
    <Node name="Liver">
        <EulerImplicitSolver rayleighStiffness="0" />
        <CGLinearSolver iterations="200" tolerance="1e-06" threshold="1e-06" />
        <TetrahedronSetTopologyContainer name="topo" src="@../meshLoader" />
        <TetrahedronSetGeometryAlgorithms template="Vec3d" name="GeomAlgo" />
        <MechanicalObject template="Vec3d" name="myLiver" showObject="1" />
        <TetrahedronFEMForceField name="FEM" youngModulus="40000" poissonRatio="0.4" method="large" />
        <MeshMatrixMass massDensity="1" />
        <FixedConstraint indices="1 3 50 20 25" />
        <Node name="Collision">
            <MeshGmshLoader name="meshLoader" filename="mesh/liver.msh" />
            <Mesh src="@meshLoader" />
            <MechanicalObject scale="1.0" />
            <Triangle name="CollisionModel" contactStiffness="5" />
            <BarycentricMapping name="CollisionMapping" />
        </Node>
    </Node>
    <Node name="Sphere">
        <EulerImplicitSolver rayleighStiffness="0" />
        <CGLinearSolver iterations="200" tolerance="1e-06" threshold="1e-06" />
        <MechanicalObject template="Rigid" name="myParticle" position="-2 10 0 0 0 0 1" showObject="1" />
        <UniformMass totalmass="1" />
        <Sphere name="Floor" listRadius="1" simulated="1" moving="1" contactStiffness="1000" />
    </Node>
</Node>
Or you can use a haptic interface. I would suggest you have a look at the Geomagic plugin, and at the scene applications/plugins/Geomagic/scenes/Geomagic-FEMLiver.scn. It's a really good example of haptic interaction with a hardware interface.
Hope this helps.
Best,
Hugo
9 January 2018 at 16:56 #10301 - Sarath
Hi Hugo,
I tried to implement the above scene, but for some reason my mouse is not being associated with the sphere. Is it because of some driver issue, or is there a way in the software to assign the object to the mouse? I forgot to mention that I am running SOFA remotely on a computing grid; is that why this is happening? If anyone has done this before, please let me know.
Thank you.
9 January 2018 at 17:35 #10302 - Hugo
Hi Sarath,
This might be normal: you need to grab the sphere using Shift + left click.
Best,
Hugo
9 January 2018 at 18:05 #10304 - Sarath
Hi Hugo,
Great, it works! I am a bit surprised, as I thought it might need some lines of code for the mouse to be engaged. I say this because I was hoping I could find the related lines and implement the same for the haptic device. Now I guess I will look into the Geomagic example, but my issue is that I am not using the Geomagic locally; I would be using it over VRPN, and I see that no code has been developed to achieve this (i.e. using the haptic device locally and SOFA on a remote machine). I will try to develop something, but I have a small doubt in this regard: as far as I can understand, the MechanicalStateController is what drives mechanical objects in SOFA. I see it used in the Geomagic example as well, as shown below:
(taken from Geomagic-FEMLiver.scn)
<!-- ADDED: the MechanicalStateController gathers events from the Omni driver and populates the mechanical state -->
<Node name="Omni">
    <MechanicalObject template="Rigid" name="DOFs" position="@GeomagicDevice.positionDevice" />
    <MechanicalStateController template="Rigid" listening="true" mainDirection="-1.0 0.0 0.0" handleEventTriggersUpdate="true" />
Could you maybe briefly explain the usage of this, and how it would access the Geomagic drivers?
I do have an example in the deprecated SofaVRPNClient, but it's not related to a haptic device; it is related to something called the VRPN Imager, which deals with image transmission over VRPN. It looks something like this (which is kind of similar to the MechanicalStateController):
(taken from VRPNImagerRigidPhysics.scn (Deprecated))
<Node name="tracking">
    <VRPNImager name="imageData" template="Rigid" listening="true" serverName="192.170.207.244" serverPort="3883" deviceName="TestImage" />
</Node>
<Node name="proxy">
    <!--<TransformEngine name="transform" template="Rigid" rotation="0 0 0" translation="0.42 0.095 0" scale="1.0 1.0 1.0" input_position="@../tracking/imageData.rigidPosition" />
    <MechanicalObject name="mechCenterTool" template="Rigid" position="@[-1].output_position" debugViewIndices="true" />-->
I hope I am not confusing you; I am just trying to understand how the Geomagic plugin is implemented so I could do something similar with VRPN, writing some code if needed. My hope is that I might have to do something like this:
<Node name="tracking">
    <VRPNButton name="Phantom" template="Rigid" listening="true" serverName="192.170.xxx.xxx" serverPort="3883" deviceName="Phantom0" />
</Node>
But I am not sure of this. Let me know your thoughts.
Thank you.
12 January 2018 at 13:17 #10332 - Hugo
Dear Sarath,
I am indeed not sure I understand.
In a simulation, you compute at each time step the evolution of the state of your object. The state, or degrees of freedom (e.g. positions), is stored in the MechanicalObject. The physics defined in your simulation (internal and external forces) makes the degrees of freedom change over time.
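As a minimal illustrative sketch of this state/physics split (not one of the scenes discussed in this thread, and omitting the topology loading a real scene would need):

```xml
<!-- Sketch only: the MechanicalObject stores the degrees of freedom
     (positions, velocities); the mass and force field define the physics
     that makes those degrees of freedom evolve at each time step. -->
<Node name="Object">
    <EulerImplicitSolver />
    <CGLinearSolver iterations="100" tolerance="1e-06" threshold="1e-06" />
    <MechanicalObject template="Vec3d" name="DOFs" />
    <UniformMass totalmass="1" />
    <TetrahedronFEMForceField youngModulus="40000" poissonRatio="0.4" method="large" />
</Node>
```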
In case you want a haptic device (e.g. a Geomagic) to control (master) an object in the scene, you have:
– a driver (GeomagicDriver) recovering the information from the device (the 3D position of the device);
– a rigid point, stored in a MechanicalObject, that takes the same position as the one given by the device.
Therefore we have:

<!-- To load the plugin -->
<RequiredPlugin pluginName="Geomagic" />
<!-- To recover the information from the device -->
<GeomagicDriver name="GeomagicDevice" deviceName="Default Device" positionBase="0 0 0" orientationBase="0 0 0 1" />
<!-- A rigid particle storing the position (Rigid) of the device -->
<MechanicalObject template="Rigid" name="GeomagicMO" position="@GeomagicDevice.positionDevice" />
What kind of information do you recover through VRPN?
12 January 2018 at 23:31 #10335 - Sarath
Hi Hugo,
Sorry to confuse you, but the basic idea is: we recover the information from the device (the Geomagic/Omni/haptic device), the same 3D position information as with the Geomagic plugin, but through VRPN instead (this is because the device is not local to the machine). VRPN transmits this information in real time over TCP/IP. The deprecated VRPN plugin doesn't have the required code for doing this; however, it does have the basic structure to implement VRPN in SOFA. The process looks like this:
                       (VRPN)
Local Machine   ---------------->   Computing grid
(haptic device)                     (SOFA software)

This is the one-way scenario. The two-way scenario is quite complex, and it would probably require some code on the local machine as well to sync the feedback to the haptic device. But this is the basic idea. Hope this illustrates the problem a bit.
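For the one-way case, I imagine a scene fragment modeled on the Geomagic driver pattern could look something like this (purely hypothetical: the VRPNTrackerDriver component and its positionDevice output do not exist anywhere, they would have to be written):

```xml
<!-- HYPOTHETICAL sketch: "VRPNTrackerDriver" and "positionDevice" are
     assumed names for a component that would read a VRPN tracker over
     TCP/IP, by analogy with GeomagicDriver. Not existing SOFA code. -->
<Node name="VRPNTool">
    <VRPNTrackerDriver name="vrpnDevice" serverName="192.170.xxx.xxx" serverPort="3883" deviceName="Phantom0" />
    <!-- A rigid particle mirroring the device position received over VRPN -->
    <MechanicalObject template="Rigid" name="toolDOFs" position="@vrpnDevice.positionDevice" />
</Node>
```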
16 January 2018 at 20:40 #10340 - Hugo
Alright, I guess it can be done similarly to the Geomagic plugin if you know what kind of data you receive through the VRPN connection.
But I'm sorry, I am really no expert in this field. Maybe some other devs in the community would better answer this.