Wu Fuli, Ding Yin, Ding Weilong, Xie Tao. Design of human computer interaction system of virtual crops based on Leap Motion[J]. Transactions of the Chinese Society of Agricultural Engineering (Transactions of the CSAE), 2016, 32(23): 144-151. DOI: 10.11975/j.issn.1002-6819.2016.23.020


    Design of human computer interaction system of virtual crops based on Leap Motion

    • Abstract: Traditional human-computer interaction systems for virtual plants generally run on a specific operating system or mobile platform, and interaction is mostly performed with a mouse and keyboard, requiring users to enter cumbersome parameters and commands, which leaves them with a poor interactive experience. To address this, this paper designs and develops a somatosensory interaction system for virtual crops based on cloud computing and somatosensory interaction technology. The system first generates three-dimensional models of the virtual crops in the cloud and stores them there. A Leap Motion somatosensory controller at the front end then captures the user's hand data, which the local computer processes to recognize a variety of gestures; these gestures interact in real time with the virtual plants rendered by WebGL in the browser, enabling operations such as translation, rotation, growth, morphological change, and 3D model updates. The paper describes in detail the basic principles of Leap Motion gesture recognition, the overall framework of the somatosensory interaction system for virtual crops, crop measurement and virtual model generation, the design principles of the data exchange protocol, WebGL graphics rendering, and the construction of the one-hand and two-hand gesture libraries. The system has been implemented with WebGL, and experimental results show that users can adjust the morphology and growth of the virtual crop models in real time in the browser with different gestures, with a good interactive experience. The system can serve as a technical reference for similar research at home and abroad.
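As an illustration of how recognized gestures might be mapped to model operations, the following sketch classifies a Leap Motion frame into one of the interactions named above (translation, rotation, growth). The frame shape, hand-count rules, and thresholds are assumptions for illustration; the paper's actual gesture library is not specified in the abstract.

```javascript
// Illustrative sketch: map Leap Motion-style frame data to a model
// operation. The field names (palmVelocity, grabStrength) follow the
// Leap Motion JavaScript conventions, but the classification rules and
// thresholds here are hypothetical, not the paper's actual gesture base.
function classifyGesture(frame) {
  if (frame.hands.length === 2) {
    // Two-hand mode: pulling the palms apart is read as "grow".
    const [a, b] = frame.hands;
    return a.palmVelocity[0] < 0 && b.palmVelocity[0] > 0 ? 'grow' : 'idle';
  }
  if (frame.hands.length === 1) {
    const h = frame.hands[0];
    if (h.grabStrength > 0.8) return 'rotate';                 // closed fist: rotate model
    if (Math.abs(h.palmVelocity[0]) > 50) return 'translate';  // fast swipe: pan model
  }
  return 'idle';
}
```

Keeping the classifier a pure function of one frame makes it easy to swap between one-hand and two-hand modes, as the system allows.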

       

      Abstract: In recent years, somatosensory technology has been applied in many fields, including entertainment, education, automation and medicine, but it is still rarely used in agriculture. Traditional human-computer interaction systems for virtual plants run on a particular operating system or mobile platform, and interaction is performed with a mouse and keyboard, requiring users to input cumbersome parameters and commands, which results in a poor interactive experience. In view of this, we designed and developed a somatosensory interaction system for virtual crops based on cloud computing and somatosensory interaction technology. The system first generated 3D models of the virtual crops in the cloud, where the models were stored. Our virtual crop models included rice and tomato. The cloud side provided the data computation capability and responded to browser requests; the browser side was responsible for display, caching and a small amount of computation, with the Leap Motion handling interaction on the browser side. To obtain the parameters needed for rice modeling, we conducted experiments at the China National Rice Research Institute in Hangzhou, Zhejiang, between 2015 and 2016, observing rice from the jointing stage to the heading stage. For each plant, we measured three blades at different leaf positions, recording blade length, blade width, the variation of width along the blade, and blade growth position. The 3D data of the virtual crops were generated algorithmically on the Amazon cloud platform. The topological structures of the tomato plants were described by a parametric L-system, with the structures separated into stems, rachises, blades, fruit branches and flower branches. Rendering the 3D crop models in the browser with WebGL allowed users to interact with them directly.
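A parametric L-system, as used above for the tomato topology, rewrites a string of parameterized modules according to production rules. The sketch below shows the rewriting mechanism only; the actual production rules for the tomato model are not given in the abstract, so the apex/internode/leaf rule here is purely illustrative.

```javascript
// Minimal sketch of a parametric L-system expander. A module is
// { sym, params }; rules map a symbol to a function returning the
// successor module list. The rule set below is an assumed example,
// not the paper's tomato grammar.
function expand(axiom, rules, iterations) {
  let word = axiom;
  for (let i = 0; i < iterations; i++) {
    word = word.flatMap(m => (rules[m.sym] ? rules[m.sym](m.params) : [m]));
  }
  return word;
}

// Illustrative rule: an apex A(t) produces a stem internode S, a leaf L,
// and a new apex with an incremented age parameter.
const rules = {
  A: ([t]) => [
    { sym: 'S', params: [1] },      // stem internode
    { sym: 'L', params: [t] },      // leaf (rachis + blade)
    { sym: 'A', params: [t + 1] },  // new apex, one step older
  ],
};

const axiom = [{ sym: 'A', params: [0] }];
const word = expand(axiom, rules, 3);
// After 3 steps the word is S L S L S L A
```

Organ categories such as fruit branches and flower branches would enter the grammar as additional module symbols with their own production rules.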
In this paper, we defined a 3D virtual crop data exchange protocol supporting crop species selection, morphological changes, crop growth, pause, shrinkage and shadow generation. When the Leap Motion received a gesture signal, the user could select a crop species, and the server would generate the corresponding 3D crop data, including geometry, color and texture information, and return it to the browser in JSON format. We built a one-hand and a two-hand gesture library, with each gesture enabling free interaction with the 3D models, and users could switch freely between one-hand and two-hand modes. The interactive system of virtual crops based on Leap Motion was composed of the server and the browser, and it achieved translation, rotation, plant growth, morphological changes and 3D model update operations. Our hardware environment was: 1) a quad-core Intel CPU at 2.4 GHz with 8 GB of RAM on the server side; 2) Windows 7 with a quad-core CPU at 2.6 GHz, 4 GB of RAM, and a 2 GB NVIDIA GeForce GTX 860M plus Intel HD 4600 on the browser side. The system was implemented in WebGL and JavaScript and ran effectively on the major browsers, including Firefox, Chrome and IE. Experimental results showed that users could control the shape and growth of the virtual crop model in real time using different gestures in the browser, with a good interactive experience. The establishment of the system can provide a technical reference for similar research.
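One round trip under such a protocol might look like the sketch below: a species-selection request and a JSON response carrying the three kinds of information the abstract lists (geometry, color, texture). The field names and message shapes are assumptions for illustration; the actual protocol fields are not published in the abstract.

```javascript
// Hypothetical request from the browser after a species-selection
// gesture; 'action' values mirror the operations the protocol supports.
const request = {
  action: 'select_species',  // also: grow, pause, shrink, shadow, morph
  species: 'rice',
};

// Stub of the server's JSON response with the three kinds of model
// information named in the abstract: geometry, color and texture.
const response = {
  geometry: {
    vertices: [0, 0, 0, 1, 0, 0, 0, 1, 0],  // flat xyz triples
    indices: [0, 1, 2],                      // one triangle
  },
  color: [0.2, 0.6, 0.2],
  texture: 'leaf_diffuse.png',  // hypothetical texture name
};

// On the browser side, the geometry arrays would be uploaded directly
// into WebGL vertex and index buffers for rendering.
function vertexCount(msg) {
  return msg.geometry.vertices.length / 3;
}
```

Keeping the payload as flat typed-array-ready lists lets the browser pass the data to WebGL buffers without further restructuring.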

       
