CN101344816A - Human-machine interaction method and device based on sight tracing and gesture discriminating - Google Patents
- Publication number
- CN101344816A (application CN200810030194A)
- Authority
- CN
- China
- Prior art keywords
- image
- screen
- finger
- people
- face
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Abstract
The invention discloses a human-computer interaction method and device based on gaze tracking and gesture recognition. The method comprises the steps of face-region detection, hand-region detection, eye localization, fingertip localization, screen localization, and gesture recognition. A straight line is determined by the eye and the fingertip; the point where this line intersects the screen is converted into the logical coordinates of the mouse on the screen, while mouse clicks are simulated by detecting the pressing actions of the finger. The device comprises an image-acquisition module, an image-processing module, and a wireless-transmission module. A camera first captures images of the user in real time; image-processing algorithms then analyze them, converting the position the user points to on the screen and the changes of gesture into the computer's logical screen coordinates and control commands; the processing results are finally transmitted to the computer through the wireless-transmission module. The invention provides a natural, intuitive, and simple human-computer interaction method that enables remote operation of a computer.
Description
Technical field
The present invention relates to human-computer interaction technology in virtual reality systems, and in particular to a human-computer interaction method and device based on gaze tracking and gesture recognition.
Background technology
With the rapid development of computer technology, interaction between people and computers has gradually become an important part of daily life. Traditional input devices such as the mouse and keyboard have limitations in naturalness and friendliness of use, so human-computer interaction techniques that match the habits of interpersonal communication have become a current research trend.
Interaction based on gaze tracking and interaction based on gesture recognition are both natural, direct, and concise. Gaze-tracking interaction obtains the position the user is looking at from the rotation of the eyeball and uses it to control the computer. It falls into contact and non-contact classes: contact methods require the user to wear special equipment to sense eyeball motion, which greatly disturbs the user, while non-contact devices have limited resolution and require the user to stay close to the camera with little head deflection. Gesture-recognition interaction infers the operation the user wants to perform from changes in the user's gestures. It falls into data-glove and computer-vision classes: data-glove methods are cumbersome to wear and restrict movement, while computer-vision methods cannot locate a target directly from the pointing direction of a finger (they must derive a relative position from specific gesture changes) and also require the user to be close to the camera. Traditional gaze-tracking and gesture-recognition techniques therefore all have defects and cannot handle long-range human-computer interaction well, for example when a speaker needs to operate a computer remotely while presenting on a large screen.
Summary of the invention
The objective of the invention is to overcome the above defects of the prior art by providing a human-computer interaction method and device based on gaze tracking and gesture recognition that enables remote operation of a computer. The user need not carry any equipment; natural pointing and clicking actions of a finger are enough to control the computer. The invention is achieved through the following technical solutions.
The human-computer interaction method based on gaze tracking and gesture recognition comprises the steps of face-region detection, hand-region detection, eye localization, fingertip localization, screen localization, and gesture recognition.
Screen localization comprises: when the user extends one finger and points at the screen, the system estimates the distances from the eye and the fingertip to the screen from the areas of the face and hand regions in the image captured by the acquisition device; the image coordinates of the eye and the fingertip are transformed into coordinates in a three-dimensional coordinate system whose origin is the image-acquisition device; the eye and the fingertip determine a straight line, and the point where this line intersects the screen is the position the user points to, whose coordinates in the three-dimensional system are computed from the proportional relationship between the eye and fingertip coordinates; finally, according to the size of the screen, this position is converted into the logical coordinates of the mouse on the screen.
Gesture recognition simulates mouse clicks by detecting finger press actions. When the user extends one finger of the right hand, points at the screen, and moves it, this is treated as moving the mouse; when the right finger closes for the first time, the left mouse button is pressed; if the finger is then extended toward the screen again and moved, the mouse is dragged with the left button held; when the right finger closes again, the left button is released. When the user extends one finger of the left hand toward the screen and then closes it, the right mouse button is pressed; when the left finger is extended again, the right button is released.
In the above method, face-region detection comprises: judging whether a face is present in the image with an Adaboost face-detection algorithm based on Haar-like rectangular features: first computing the integral image and extracting the rectangular features, then searching the image for the face region with a pre-trained classifier feature database using a Cascade of classifiers. Training the classifier feature database comprises: computing the integral images of sample images and extracting their rectangular features; screening effective features with the Adaboost algorithm to form weak classifiers; combining multiple weak classifiers into strong classifiers; and cascading multiple strong classifiers to form the face-detection classifier feature database.
In the above method, hand-region detection comprises: searching the image for regions that match the skin-color features of the face by skin-color matching; after the hand region is preliminarily segmented, removing the interference of the face and neck according to the position of the face region, and removing background interference according to the areas of the connected components, so that the hand region is detected.
In the above method, eye localization comprises: on the basis of the detected face region, computing the horizontal gray-level projection of the face image, then searching the projection curve for local minima and judging from the facial features whether each corresponds to the eye region; once the eye region is detected, its midpoint is taken as the eye coordinates.
In the above method, fingertip localization comprises: first performing edge detection and grid sampling on the hand-region image; then, with each sampled contour pixel as the center, choosing 4 adjacent pixels in the counterclockwise and in the clockwise direction to form four pixel pairs; computing the distance of each pair, the sampled pixel whose pair-distance variance is minimal and below a threshold is the fingertip region; once the fingertip region is detected, its midpoint is taken as the fingertip coordinates.
A device implementing the above method comprises an image-acquisition module, an image-processing module, and a wireless-transmission module. The camera in the image-acquisition module is placed at the top center of the screen and captures images of the user for the image-processing module. The image-processing module controls the other two modules and runs the image-processing algorithms that analyze the captured images and convert the position the user points to and the changes of gesture into the computer's logical screen coordinates and control commands. The wireless-transmission module comprises a receiving module and a sending module: the sending module is connected to the image-processing module and transmits the processing results to the receiving module by radio; the receiving module is connected to the computer and converts the results into mouse control signals input to the computer.
The image-acquisition module comprises a camera; the image-processing module comprises an embedded processor and peripheral components; the receiving and sending modules each include a radio-frequency chip and a single-chip microcomputer.
Compared with the prior art, the invention has the following advantages and effects: it is a natural and intuitive interaction mode in which the user carries no equipment and memorizes no complex operations, since natural finger pointing and clicking are enough to control the computer; it combines gaze tracking and gesture recognition, overcoming the defects of traditional gaze-tracking and gesture-recognition techniques that require special wearable equipment and limit the user's freedom, and provides a simple, unconstrained mode of operation usable for long-range control of a computer; and the device is compact and easy to use: placing the camera at the top center of the screen and connecting the wireless communication module to the computer is all that is needed.
Description of drawings
Fig. 1 is a schematic diagram of the hardware configuration in the embodiment of the invention.
Fig. 2 is a schematic diagram of the usage scenario in the embodiment of the invention.
Fig. 3 is a schematic workflow diagram in the embodiment of the invention.
Fig. 4a and Fig. 4b show the positional relationship of the eye and the screen in, respectively, the X-Z and Y-Z planes of the three-dimensional coordinate system with the camera as origin.
Fig. 5 is the localization model that determines the screen position from the eye and fingertip coordinates.
Embodiment
The specific embodiment of the invention is described further below with reference to the drawings.
As shown in Fig. 1, the human-computer interaction system based on gaze tracking and gesture recognition consists of three parts: an image-acquisition module, an image-processing module, and a wireless communication module. The image-acquisition module comprises a camera that captures images of the user in real time and transfers them to the image-processing module. The image-processing module, built from a high-performance embedded processor and peripheral components, controls the other two modules and runs the image-processing algorithms that analyze the captured images and convert the pointed-at screen position and gesture changes into logical screen coordinates and control commands. The wireless communication module is divided into a sending part and a receiving part, each built from a single-chip microcomputer and a radio-frequency chip; it transmits the results of the image-processing module and converts them into mouse control signals that are input to the computer.
As shown in Fig. 2, the camera 1 is placed at the center of the upper edge of the large screen 2 and captures images of the user in real time. When the user wants to move the mouse to a certain position, pointing at that position on the screen with a right-hand finger 3 is enough: the eye 4 and the fingertip of finger 3 determine a straight line, and the point where this line intersects the screen is the position the user points to. Left-mouse-button operations are performed by clicking actions of the right-hand finger, and right-mouse-button operations by clicking actions of the left-hand finger.
The workflow of the embodiment is shown in Fig. 3. A user image is first captured by the camera and analyzed with the face-detection algorithm. Whether a user is currently using the system is judged by whether a face is present in the image, and subsequent processing runs only after a face is detected. On the basis of the detected face, the eye-localization algorithm searches the face image for the eye region and obtains the eye coordinates. The hand-detection algorithm then analyzes the image and detects the hand region, and on that basis the fingertip-localization algorithm analyzes the hand-region image to obtain the fingertip coordinates. If the system detects the hand region and locates the fingertip, the user is considered to be pointing at the screen, and the mouse coordinates on the screen are computed from the coordinate-transformation and localization models. If the hand region is detected but no fingertip is located, the finger is considered closed, meaning a click action has occurred, and the computer's control command is obtained by recognizing the click. After each frame is processed, the result is transmitted by radio to the wireless receiving module of the computer, which converts it into mouse control signals input to the computer.
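The per-frame workflow above can be sketched as a small dispatch function. This is an illustrative reconstruction, not the patent's implementation; the four detector callables (`detect_face`, `locate_eyes`, `detect_hand`, `locate_fingertip`) are assumed placeholders that return `None` on failure:

```python
def process_frame(frame, detect_face, locate_eyes, detect_hand, locate_fingertip):
    """One iteration of the Fig. 3 workflow; detectors return None on failure."""
    face = detect_face(frame)
    if face is None:
        return None                       # no face: nobody is using the system
    eye = locate_eyes(frame, face)
    hand = detect_hand(frame, face)
    if hand is None:
        return None                       # no hand visible: nothing to do
    tip = locate_fingertip(frame, hand)
    if tip is None:
        return ("click",)                 # hand found but finger closed: a click
    return ("point", eye, tip)            # pointing: map the eye-tip line to the screen
```

Each frame thus yields either a pointing event (carrying the eye and fingertip coordinates), a click event, or nothing.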
In this embodiment, face detection uses an Adaboost algorithm based on Haar-like rectangular features. The system first computes the integral image and extracts the rectangular features, then searches the image for the face region with the pre-trained classifier feature database, using a Cascade of classifiers. The database used by this system consists of 22 stages of strong classifiers, each composed of several weak classifiers. The system first extracts all 80 × 80 subwindows of the image and passes each through the cascade, which eliminates non-face subwindows stage by stage. If exactly one subwindow passes all 22 stages, it is taken as the face window; if several subwindows pass all 22 stages, adjacent candidate windows are merged and the best face window is selected. If no qualifying subwindow is found, the subwindow size is increased by a factor of 1.1 and detection through the cascade is repeated.
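The integral image at the heart of this detector can be computed in one pass, after which any Haar-like rectangle sum costs four table lookups. A minimal pure-Python sketch (function names are mine, not the patent's):

```python
def integral_image(img):
    """Summed-area table with a one-pixel zero border: ii[y][x] = sum of img[:y][:x]."""
    h, w = len(img), len(img[0])
    ii = [[0] * (w + 1) for _ in range(h + 1)]
    for y in range(h):
        row_sum = 0
        for x in range(w):
            row_sum += img[y][x]
            ii[y + 1][x + 1] = ii[y][x + 1] + row_sum
    return ii

def rect_sum(ii, x, y, w, h):
    """Sum of pixels in the w*h rectangle whose top-left corner is (x, y)."""
    return ii[y + h][x + w] - ii[y][x + w] - ii[y + h][x] + ii[y][x]

def two_rect_feature(ii, x, y, w, h):
    """One Haar-like feature: left half minus right half of a w*h window."""
    half = w // 2
    return rect_sum(ii, x, y, half, h) - rect_sum(ii, x + half, y, half, h)
```

Every rectangular feature evaluated by the cascade reduces to a few `rect_sum` calls, which is what makes scanning all 80 × 80 subwindows feasible.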
In this embodiment, the classifier feature database is trained offline with a large number of face and non-face samples. Since the face region is generally square, a square pixel region of each sample image is chosen first, and its integral image and rectangular features are computed. Features are then screened with the AdaBoost algorithm: effective features and thresholds are selected to form weak classifiers, and multiple weak classifiers are combined into a strong classifier. Following the Cascade method, the two most distinctive facial features form the first strong classifier, and strong classifiers built from more features serve as further detection stages, so that the cascade of strong classifiers constitutes the face-detection classifier.
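Feature screening in AdaBoost picks, for each candidate feature, the decision-stump threshold and polarity with the lowest weighted error. A single-pass sketch over sorted feature values; this is an illustrative reconstruction with my own naming, not the patent's code:

```python
def best_stump(values, labels, weights):
    """Return (weighted_error, threshold, polarity) for one feature.

    polarity +1 means 'predict face when value >= threshold';
    polarity -1 means 'predict face when value < threshold'.
    """
    order = sorted(zip(values, labels, weights))
    total_pos = sum(w for _, lab, w in order if lab == 1)
    total_neg = sum(w for _, lab, w in order if lab == 0)
    pos_below = neg_below = 0.0
    best = (float("inf"), None, 0)
    for v, lab, w in order:
        err_plus = pos_below + (total_neg - neg_below)   # faces below + non-faces above
        err_minus = neg_below + (total_pos - pos_below)  # the mirror image
        if err_plus < best[0]:
            best = (err_plus, v, 1)
        if err_minus < best[0]:
            best = (err_minus, v, -1)
        if lab == 1:
            pos_below += w
        else:
            neg_below += w
    return best
```

AdaBoost calls this for every candidate feature on the current sample weights, keeps the stump with the smallest error as the next weak classifier, and reweights the samples before the next round.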
In this embodiment, eye localization uses a method based on facial features. On the basis of the detected face, the system first smooths the face image and then computes its horizontal gray-level projection. Owing to the structure of the face, this gray curve has several local minima, corresponding to the eyebrows, eyes, nostrils, and mouth. The system therefore first assumes that the second local minimum of the curve is the eye position, takes 1/20 of the image above and below this point as the candidate eye region, and confirms it with the following features. (1) From the position of the eyes, the local minimum must lie in the upper half of the face. (2) Since the eyebrow-to-eye distance is smaller than the eye-to-nostril distance, the distance from the second local minimum to the previous local minimum must be smaller than the distance to the next one. (3) The horizontal projection of the candidate eye region must show one crest between two troughs. Only a region satisfying all three conditions is confirmed as the eye region; otherwise the candidate region is adjusted. Once the eye region is confirmed, its midpoint is taken as the eye coordinates.
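The horizontal gray projection and its local minima can be sketched directly; the smoothing step and the three confirmation checks are omitted here for brevity, and the function names are mine:

```python
def horizontal_gray_projection(gray):
    """Mean gray level of each image row (rows of equal length assumed)."""
    return [sum(row) / len(row) for row in gray]

def local_minima(curve):
    """Indices that are strictly lower than both neighbours."""
    return [i for i in range(1, len(curve) - 1)
            if curve[i] < curve[i - 1] and curve[i] < curve[i + 1]]
```

On a face image the minima fall roughly at the eyebrows, eyes, nostrils, and mouth, so the second minimum is taken as the first eye candidate and then verified.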
In this embodiment, hand detection uses skin-color matching for a preliminary segmentation of the hand region. On the basis of the detected face, a 20 × 20 rectangle below the eyes is first taken as the skin-color sample of the face, and the mean Y, Cb, and Cr values of the 400 pixels inside it are computed. Each pixel of the image is then matched against upper and lower thresholds obtained by adding and subtracting 10 from these sample means. Pixels satisfying this skin model are judged to be skin pixels, and after matching, the hand region is preliminarily segmented. The interference of the face is removed using the detected face region; using the geometric relationship of face and neck, a rectangle directly below the face, twice the face width and the same height as the face, is taken as the neck area, so neck interference can be removed even when the face is deflected. Since the area of the hand is much larger than that of background interference regions, connected regions whose area is below a threshold are treated as background interference and removed, leaving the detected hand region.
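The per-pixel skin test (sample means plus or minus 10 in each YCbCr channel) is only a few lines. An illustrative sketch with invented names, taking the image as nested (Y, Cb, Cr) tuples:

```python
def channel_means(pixels):
    """Mean (Y, Cb, Cr) over a list of pixel tuples, e.g. the 20x20 face sample."""
    n = len(pixels)
    return tuple(sum(p[c] for p in pixels) / n for c in range(3))

def skin_mask(image, sample_pixels, margin=10):
    """True where every channel lies within mean +/- margin of the face sample."""
    means = channel_means(sample_pixels)
    return [[all(abs(px[c] - means[c]) <= margin for c in range(3)) for px in row]
            for row in image]
```

Sampling the thresholds from the user's own face makes the model adaptive to lighting and skin tone, which is why the patent anchors it to the detected face region rather than fixed constants.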
In this embodiment, fingertip localization uses the shape features of the hand contour. The system first applies a gradient operator to the hand-region image for edge detection, obtaining the hand contour, and then grid-samples the contour so that each 10 × 10 region of the original image is represented by a single pixel. With each sampled contour pixel as the center, 4 adjacent pixels are chosen in the counterclockwise and in the clockwise direction, forming four pixel pairs symmetric about the center. The distance of each pair is computed; the sampled pixel whose pair-distance variance is minimal and below a threshold is the fingertip region. Once the fingertip region is obtained, its midpoint is taken as the fingertip coordinates.
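The symmetric-pair variance test can be sketched over a closed, ordered contour of sampled points. This is a reconstruction under my own naming, with the edge detection and grid sampling assumed already done; at a sharp tip the pairs straddling it stay at a nearly constant short distance, so their variance is near zero:

```python
import math

def fingertip_index(contour, pairs=4, var_threshold=1.0):
    """Index of the contour point whose symmetric neighbour pairs have the
    smallest (and sufficiently small) distance variance; None if none qualifies."""
    n = len(contour)
    best_i, best_var = None, float("inf")
    for i in range(n):
        dists = [math.dist(contour[(i - j) % n], contour[(i + j) % n])
                 for j in range(1, pairs + 1)]
        mean = sum(dists) / pairs
        var = sum((d - mean) ** 2 for d in dists) / pairs
        if var < best_var:
            best_i, best_var = i, var
    return best_i if best_var < var_threshold else None
```

On a spike-shaped contour the apex wins: the four pairs bracketing it are all the same width apart, while pairs around a smooth or straight stretch spread over growing distances.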
In this embodiment, the position the user points to on the screen is determined from the image coordinates of the eye and the fingertip as follows:
(1) A three-dimensional coordinate system is established with the camera as the origin.
(2) After the system detects the face region and the hand region in the image, it estimates the distances d_E and d_F of the eye and the fingertip to the screen from the areas of those regions.
(3) Eye localization and fingertip localization yield the image coordinates (x_E_image, y_E_image) of the eye and (x_F_image, y_F_image) of the fingertip. To determine the pointed-at screen position from the line through the eye and the fingertip, these image coordinates must be transformed into the three-dimensional coordinate system with the camera as origin. Fig. 4a and Fig. 4b show the positional relationship of the eye and the screen in the X-Z and Y-Z planes of this system. Since the image coordinates of the eye are proportional to its coordinates in this system, the eye's coordinate is:

x_E = g(d_E) × (L/2 − x_E_image)

where g(d_E) is a scale coefficient that depends on the distance, L is the width of the captured image, θ is the tilt angle of the camera, and W is the height of the captured image. The fingertip coordinates x_F and y_F are obtained by the same calculation.
(4) Once the system has the eye and fingertip coordinates in this three-dimensional system, it computes the pointed-at screen position from the straight line determined by the two points. Fig. 5 shows the localization model that determines the screen position from the eye and fingertip coordinates; from the proportional relationships in this model, the coordinates of the position the fingertip points to on the screen are obtained. With the pointed-at position determined, the logical coordinates of the mouse on the screen are computed according to the size of the screen.
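With both points expressed in the camera-origin system, and taking the screen as the plane z = 0 (the camera sits on the screen's top edge), the intersection of the eye-fingertip line with the screen is a similar-triangles computation. An illustrative sketch, not the patent's exact model; the pixel mapping assumes the camera at the horizontal center of the top edge, with x to the right and y downward:

```python
def pointed_screen_position(eye, tip):
    """Intersect the line through eye=(x, y, z) and tip=(x, y, z) with the plane z = 0.

    z is the distance to the screen plane (d_E for the eye, d_F for the fingertip);
    the eye is assumed farther from the screen than the fingertip.
    """
    x_e, y_e, d_e = eye
    x_f, y_f, d_f = tip
    t = d_e / (d_e - d_f)            # how far past the fingertip the line reaches z = 0
    return (x_e + t * (x_f - x_e), y_e + t * (y_f - y_e))

def to_logical_coords(point, screen_w, screen_h, res_x, res_y):
    """Map a physical screen point (camera-origin units) to pixel coordinates."""
    x, y = point
    px = (x + screen_w / 2) / screen_w * res_x
    py = y / screen_h * res_y
    return (px, py)
```

For an eye 2 m and a fingertip 1 m from the screen, t = 2, so the pointed position lies twice as far from the eye as the fingertip does, along the same ray.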
In this embodiment, gesture recognition simulates mouse clicks by detecting the click actions of the user's fingers. Extending one right-hand finger toward the screen and moving it is treated as moving the mouse. When the right finger closes for the first time, the left mouse button is pressed; if the finger is then extended and moved, the mouse is dragged with the left button held; when the right finger closes again, the left button is released. Extending one left-hand finger toward the screen and then closing it presses the right mouse button; re-extending the left finger releases the right mouse button.
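The button rules above amount to a small edge-triggered state machine over per-frame finger states. An illustrative sketch (class and event names are mine); each update takes whether the right and left index fingers are currently detected as extended:

```python
class GestureMouse:
    """Turns finger extend/close transitions into mouse-button events."""

    def __init__(self):
        self.prev_right = False
        self.prev_left = False
        self.left_down = False    # simulated left-button state, toggled by right-finger closes
        self.right_down = False   # simulated right-button state

    def update(self, right_extended, left_extended):
        events = []
        if self.prev_right and not right_extended:    # right finger just closed
            self.left_down = not self.left_down       # 1st close presses, 2nd releases
            events.append("left_down" if self.left_down else "left_up")
        if self.prev_left and not left_extended:      # left finger just closed
            self.right_down = True
            events.append("right_down")
        elif left_extended and not self.prev_left and self.right_down:
            self.right_down = False                   # re-extending releases the right button
            events.append("right_up")
        self.prev_right, self.prev_left = right_extended, left_extended
        return events
```

Cursor motion while the right finger stays extended is reported separately by the screen-localization step; dragging is simply motion while `left_down` is True.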
Claims (8)
1. A human-computer interaction method based on gaze tracking and gesture recognition, comprising the steps of (1) face-region detection, (2) hand-region detection, (3) eye localization, and (4) fingertip localization, characterized by further comprising the steps of:
(5) screen localization: when the user extends one finger and points at the screen, the system estimates the distances from the eye and the fingertip to the screen from the areas of the face and hand regions in the image captured by the acquisition device; the image coordinates of the eye and the fingertip are transformed into coordinates in a three-dimensional coordinate system whose origin is the image-acquisition device; the eye and the fingertip determine a straight line, and the point where this line intersects the screen is the position the user points to, whose coordinates in the three-dimensional system are computed from the proportional relationship between the eye and fingertip coordinates; according to the size of the screen, this position is converted into the logical coordinates of the mouse on the screen;
(6) gesture recognition: simulating mouse clicks by detecting finger press actions; when the user extends one finger of the right hand, points at the screen, and moves it, this is treated as moving the mouse; when the right finger closes for the first time, the left mouse button is pressed; if the finger is then extended toward the screen again and moved, the mouse is dragged with the left button held; when the right finger closes again, the left button is released; when the user extends one finger of the left hand toward the screen and then closes it, the right mouse button is pressed; when the left finger is extended again, the right mouse button is released.
2. The method according to claim 1, characterized in that step (1) comprises: judging whether a face is present in the image with an Adaboost face-detection algorithm based on Haar-like rectangular features: first computing the integral image and extracting the rectangular features, then searching the image for the face region with a pre-trained classifier feature database using a Cascade of classifiers.
3. The method according to claim 2, characterized in that training the classifier feature database comprises: computing the integral images of sample images and extracting their rectangular features; screening effective features with the Adaboost algorithm to form weak classifiers; combining multiple weak classifiers into strong classifiers; and cascading multiple strong classifiers to form the face-detection classifier feature database.
4. The method according to claim 1, characterized in that step (2) comprises: searching the image for regions matching the skin-color features of the face by skin-color matching; after the hand region is preliminarily segmented, removing the interference of the face and neck according to the position of the face region, and removing background interference according to the areas of the connected components, so that the hand region is detected.
5. The method according to claim 1, characterized in that step (3) comprises: on the basis of the detected face region, computing the horizontal gray-level projection of the face image, then searching the projection curve for local minima and judging from the facial features whether each is the eye region; once the eye region is detected, taking its midpoint as the eye coordinates.
6. The method according to claim 1, characterized in that step (4) comprises: first performing edge detection and grid sampling on the hand-region image; then, with each sampled contour pixel as the center, choosing 4 adjacent pixels in the counterclockwise and clockwise directions to form four pixel pairs; computing the distance of each pair, the sampled pixel whose pair-distance variance is minimal and below a threshold being the fingertip region; once the fingertip region is detected, taking its midpoint as the fingertip coordinates.
7. A device implementing the method of any of claims 1 to 5, characterized by comprising an image-acquisition module, an image-processing module, and a wireless-transmission module; the camera in the image-acquisition module is placed at the top center of the screen and captures images of the user for the image-processing module; the image-processing module controls the other two modules and runs the image-processing algorithms that analyze the captured images and convert the position the user points to and the changes of gesture into the computer's logical screen coordinates and control commands; the wireless-transmission module comprises a receiving module and a sending module, the sending module being connected to the image-processing module and transmitting the processing results to the receiving module by radio, and the receiving module being connected to the computer and converting the results into mouse control signals input to the computer.
8. The device according to claim 7, characterized in that the image-acquisition module comprises a camera, the image-processing module comprises an embedded processor and peripheral components, and the receiving and sending modules each include a radio-frequency chip and a single-chip microcomputer.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN2008100301944A CN101344816B (en) | 2008-08-15 | 2008-08-15 | Human-machine interaction method and device based on sight tracing and gesture discriminating |
Publications (2)
Publication Number | Publication Date |
---|---|
CN101344816A | 2009-01-14 |
CN101344816B | 2010-08-11 |
Family
ID=40246828
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN2008100301944A Expired - Fee Related CN101344816B (en) | 2008-08-15 | 2008-08-15 | Human-machine interaction method and device based on sight tracing and gesture discriminating |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN101344816B (en) |
Cited By (108)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101783865A (en) * | 2010-02-26 | 2010-07-21 | 中山大学 | Digital set-top box and intelligent mouse control method based on same |
CN101799717A (en) * | 2010-03-05 | 2010-08-11 | 天津大学 | Man-machine interaction method based on hand action catch |
CN101813976A (en) * | 2010-03-09 | 2010-08-25 | 华南理工大学 | Sighting tracking man-computer interaction method and device based on SOC (System On Chip) |
CN101872244A (en) * | 2010-06-25 | 2010-10-27 | 中国科学院软件研究所 | Method for human-computer interaction based on hand movement and color information of user |
CN101901339A (en) * | 2010-07-30 | 2010-12-01 | 华南理工大学 | Hand movement detecting method |
CN101923433A (en) * | 2010-08-17 | 2010-12-22 | 北京航空航天大学 | Man-computer interaction mode based on hand shadow identification |
CN102012778A (en) * | 2009-09-04 | 2011-04-13 | 索尼公司 | Display control apparatus, display control method, and display control program |
CN102081503A (en) * | 2011-01-25 | 2011-06-01 | 汉王科技股份有限公司 | Electronic reader capable of automatically turning pages based on eye tracking and method thereof |
WO2011079640A1 (en) * | 2009-12-29 | 2011-07-07 | Hu Shixi | Method for determining whether target point belongs to flat plane or not, mouse and touch screen |
CN102142084A (en) * | 2011-05-06 | 2011-08-03 | 北京网尚数字电影院线有限公司 | Method for gesture recognition |
CN101719015B (en) * | 2009-11-03 | 2011-08-31 | 上海大学 | Method for positioning finger tips of directed gestures |
CN101694692B (en) * | 2009-10-22 | 2011-09-07 | 浙江大学 | Gesture identification method based on acceleration transducer |
CN102184021A (en) * | 2011-05-27 | 2011-09-14 | 华南理工大学 | Television man-machine interaction method based on handwriting input and fingertip mouse |
CN102192173A (en) * | 2010-03-08 | 2011-09-21 | 艾美特电器(深圳)有限公司 | Intelligent gesture control electric fan |
CN102200834A (en) * | 2011-05-26 | 2011-09-28 | 华南理工大学 | television control-oriented finger-mouse interaction method |
CN102222342A (en) * | 2010-04-16 | 2011-10-19 | 上海摩比源软件技术有限公司 | Tracking method of human body motions and identification method thereof |
WO2011127646A1 (en) * | 2010-04-13 | 2011-10-20 | Nokia Corporation | An apparatus, method, computer program and user interface |
CN102270275A (en) * | 2010-06-04 | 2011-12-07 | 汤姆森特许公司 | Method for selection of an object in a virtual environment |
CN102270035A (en) * | 2010-06-04 | 2011-12-07 | 三星电子株式会社 | Apparatus and method for selecting and operating object in non-touch mode |
CN102279669A (en) * | 2010-06-11 | 2011-12-14 | 游森溢 | Input system for using display image |
WO2012034469A1 (en) * | 2010-09-17 | 2012-03-22 | 腾讯科技(深圳)有限公司 | Gesture-based human-computer interaction method and system, and computer storage media |
CN102402289A (en) * | 2011-11-22 | 2012-04-04 | 华南理工大学 | Mouse recognition method for gesture based on machine vision |
CN102411477A (en) * | 2011-11-16 | 2012-04-11 | 鸿富锦精密工业(深圳)有限公司 | Electronic equipment and text reading guide method thereof |
CN102411478A (en) * | 2011-11-16 | 2012-04-11 | 鸿富锦精密工业(深圳)有限公司 | Electronic device and text guiding method therefor |
CN102457607A (en) * | 2010-10-20 | 2012-05-16 | 浪潮乐金数字移动通信有限公司 | Image sensing input mobile communication terminal and image sensing input method thereof |
CN102467234A (en) * | 2010-11-12 | 2012-05-23 | Lg电子株式会社 | Method for providing display image in multimedia device and multimedia device thereof |
CN102467236A (en) * | 2010-11-17 | 2012-05-23 | 夏普株式会社 | Instruction accepting apparatus and instruction accepting method |
RU2455676C2 (en) * | 2011-07-04 | 2012-07-10 | Общество с ограниченной ответственностью "ТРИДИВИ" | Method of controlling device using gestures and 3d sensor for realising said method |
CN102592115A (en) * | 2011-12-26 | 2012-07-18 | Tcl集团股份有限公司 | Hand positioning method and system |
CN102591451A (en) * | 2010-12-27 | 2012-07-18 | 日立民用电子株式会社 | Image processing device and image display device |
CN102622081A (en) * | 2011-01-30 | 2012-08-01 | 北京新岸线网络技术有限公司 | Method and system for realizing somatic sensory interaction |
CN102662464A (en) * | 2012-03-26 | 2012-09-12 | 华南理工大学 | Gesture control method of gesture roaming control system |
CN102710908A (en) * | 2012-05-31 | 2012-10-03 | 无锡商业职业技术学院 | Device for controlling television based on gesture |
CN102761684A (en) * | 2011-04-28 | 2012-10-31 | 奥林巴斯映像株式会社 | An image capturing device and an image data recording method |
CN102981742A (en) * | 2012-11-28 | 2013-03-20 | 无锡市爱福瑞科技发展有限公司 | Gesture interaction system based on computer visions |
WO2013056431A1 (en) * | 2011-10-18 | 2013-04-25 | Nokia Corporation | Methods and apparatuses for gesture recognition |
CN103092334A (en) * | 2011-10-31 | 2013-05-08 | 财团法人资讯工业策进会 | Virtual mouse driving device and virtual mouse simulation method |
CN103154858A (en) * | 2010-09-22 | 2013-06-12 | 岛根县 | Operation input apparatus, operation input method, and program |
WO2013082760A1 (en) * | 2011-12-06 | 2013-06-13 | Thomson Licensing | Method and system for responding to user's selection gesture of object displayed in three dimensions |
CN103164022A (en) * | 2011-12-16 | 2013-06-19 | 国际商业机器公司 | Multi-finger touch method, device and portable type terminal device |
CN103177373A (en) * | 2011-12-22 | 2013-06-26 | 上海易络客网络技术有限公司 | Interactive commercial experiencing method and system |
CN103218105A (en) * | 2012-01-19 | 2013-07-24 | 联想(北京)有限公司 | Processing method and system for electronic equipment, and electronic equipment |
CN103294173A (en) * | 2012-02-24 | 2013-09-11 | 冠捷投资有限公司 | Remote control system based on user actions and method thereof |
CN103309446A (en) * | 2013-05-30 | 2013-09-18 | 上海交通大学 | Virtual data acquisition and transmission system taking both hands of humankind as carrier |
CN103442177A (en) * | 2013-08-30 | 2013-12-11 | 程治永 | PTZ video camera control system and method based on gesture identification |
CN103488299A (en) * | 2013-10-15 | 2014-01-01 | 大连市恒芯科技有限公司 | Intelligent terminal man-machine interaction method fusing human face and gestures |
CN103502912A (en) * | 2011-05-09 | 2014-01-08 | 皇家飞利浦有限公司 | Rotating an object on a screen |
CN103513906A (en) * | 2012-06-28 | 2014-01-15 | 联想(北京)有限公司 | Order identifying method and device and electronic device |
CN103529929A (en) * | 2012-07-06 | 2014-01-22 | 原相科技股份有限公司 | Gesture recognition system and glasses capable of recognizing gesture actions |
CN103577075A (en) * | 2013-11-11 | 2014-02-12 | 惠州Tcl移动通信有限公司 | Parameter adjusting method and device of electronic equipment |
CN103616954A (en) * | 2013-12-06 | 2014-03-05 | Tcl通讯(宁波)有限公司 | Virtual keyboard system, implementation method and mobile terminal |
CN103690146A (en) * | 2013-12-13 | 2014-04-02 | 重庆大学 | Novel eye tracker |
CN103761508A (en) * | 2014-01-02 | 2014-04-30 | 大连理工大学 | Biological recognition method and system combining face and gestures |
CN103777744A (en) * | 2012-10-23 | 2014-05-07 | 中国移动通信集团公司 | Method and device for achieving input control and mobile terminal |
CN103793060A (en) * | 2014-02-14 | 2014-05-14 | 杨智 | User interaction system and method |
CN103853339A (en) * | 2012-12-03 | 2014-06-11 | 广达电脑股份有限公司 | Input device and electronic device |
CN103888820A (en) * | 2014-03-28 | 2014-06-25 | 湖南网圣腾飞信息技术有限公司 | Game control system and method with smart mobile phone and intelligent television combined |
CN103914146A (en) * | 2013-01-08 | 2014-07-09 | 英飞凌科技股份有限公司 | Method for controlling control parameter through gestures |
CN104090663A (en) * | 2014-07-14 | 2014-10-08 | 济南大学 | Gesture interaction method based on visual attention model |
CN104142741A (en) * | 2013-05-08 | 2014-11-12 | 宏碁股份有限公司 | Electronic device and touch control detecting method thereof |
CN104145232A (en) * | 2012-01-04 | 2014-11-12 | 托比技术股份公司 | System for gaze interaction |
WO2014186972A1 (en) * | 2013-05-24 | 2014-11-27 | Thomson Licensing | Method and apparatus for rendering object for multiple 3d displays |
CN104181838A (en) * | 2014-08-07 | 2014-12-03 | 重庆电子工程职业学院 | Gesture control glass window cleaning apparatus based on Internet of Things technology |
US8963836B2 (en) | 2010-09-17 | 2015-02-24 | Tencent Technology (Shenzhen) Company Limited | Method and system for gesture-based human-machine interaction and computer-readable medium thereof |
CN104541232A (en) * | 2012-09-28 | 2015-04-22 | 英特尔公司 | Multi-modal touch screen emulator |
CN104662587A (en) * | 2012-07-27 | 2015-05-27 | 日本电气方案创新株式会社 | Three-dimensional user-interface device, and three-dimensional operation method |
CN104680123A (en) * | 2013-11-26 | 2015-06-03 | 富士通株式会社 | Object identification device, object identification method and program |
CN104714639A (en) * | 2014-12-30 | 2015-06-17 | 上海孩子国科教设备有限公司 | Trans-space operation method and client side |
CN104866081A (en) * | 2014-02-25 | 2015-08-26 | 中兴通讯股份有限公司 | Terminal operation method and device as well as terminal |
CN104932668A (en) * | 2014-03-20 | 2015-09-23 | 冠捷投资有限公司 | Play content driving device and method for display system |
CN105338390A (en) * | 2015-12-09 | 2016-02-17 | 陈国铭 | Intelligent television control system |
CN105491426A (en) * | 2014-09-16 | 2016-04-13 | 洪永川 | Gesture remote control television system |
CN105630142A (en) * | 2014-11-07 | 2016-06-01 | 中兴通讯股份有限公司 | Method and device for publishing and transferring identification information, and information identification system |
CN105929954A (en) * | 2016-04-19 | 2016-09-07 | 京东方科技集团股份有限公司 | Cursor control method and apparatus as well as display device |
CN106022211A (en) * | 2016-05-04 | 2016-10-12 | 北京航空航天大学 | Method using gestures to control multimedia device |
CN106020478A (en) * | 2016-05-20 | 2016-10-12 | 青岛海信电器股份有限公司 | Intelligent terminal manipulation method, intelligent terminal manipulation apparatus and intelligent terminal |
CN106569600A (en) * | 2016-10-31 | 2017-04-19 | 邯郸美的制冷设备有限公司 | Gesture verification method and device for controlling air conditioners |
WO2017084319A1 (en) * | 2015-11-18 | 2017-05-26 | 乐视控股(北京)有限公司 | Gesture recognition method and virtual reality display output device |
CN106778597A * | 2016-12-12 | 2017-05-31 | 朱明 | Intelligent vision detector based on image analysis |
CN107015636A (en) * | 2016-10-27 | 2017-08-04 | 蔚来汽车有限公司 | The aobvious equipment gestural control method of virtual reality |
CN107193373A (en) * | 2012-09-03 | 2017-09-22 | 联想(北京)有限公司 | A kind of information processing method and electronic equipment |
CN107272899A (en) * | 2017-06-21 | 2017-10-20 | 北京奇艺世纪科技有限公司 | A kind of VR exchange methods, device and electronic equipment based on dynamic gesture |
DE102016124906A1 (en) * | 2016-12-20 | 2017-11-30 | Miele & Cie. Kg | Method of controlling a floor care appliance and floor care appliance |
CN107463257A (en) * | 2017-08-03 | 2017-12-12 | 微景天下(北京)科技有限公司 | A kind of man-machine interaction method and device of Virtual Reality system |
CN107491755A (en) * | 2017-08-16 | 2017-12-19 | 京东方科技集团股份有限公司 | Method and device for gesture identification |
CN107533416A (en) * | 2015-03-20 | 2018-01-02 | 株式会社理光 | Display device, display control method, display control program and display system |
CN107741784A (en) * | 2017-10-09 | 2018-02-27 | 济南大学 | A kind of amusement exchange method suitable for leaden paralysis patient |
CN108027886A (en) * | 2015-09-23 | 2018-05-11 | 高通股份有限公司 | Use the system and method for increment object detection of dual threshold local binary pattern operator |
CN108108024A (en) * | 2018-01-02 | 2018-06-01 | 京东方科技集团股份有限公司 | Dynamic gesture acquisition methods and device, display device |
CN108369630A (en) * | 2015-05-28 | 2018-08-03 | 视觉移动科技有限公司 | Gestural control system and method for smart home |
CN108845668A (en) * | 2012-11-07 | 2018-11-20 | 北京三星通信技术研究有限公司 | Man-machine interactive system and method |
US10168773B2 (en) | 2014-07-15 | 2019-01-01 | Huawei Technologies Co., Ltd. | Position locating method and apparatus |
CN109190516A (en) * | 2018-08-14 | 2019-01-11 | 东北大学 | A kind of static gesture identification method based on volar edge contour vectorization |
CN109542219A (en) * | 2018-10-22 | 2019-03-29 | 广东精标科技股份有限公司 | A kind of gesture interaction system and method applied to smart classroom |
CN109597489A (en) * | 2018-12-27 | 2019-04-09 | 武汉市天蝎科技有限公司 | A kind of method and system of the eye movement tracking interaction of near-eye display device |
CN109683719A (en) * | 2019-01-30 | 2019-04-26 | 华南理工大学 | A kind of visual projection's exchange method based on YOLOv3 |
CN109917921A (en) * | 2019-03-28 | 2019-06-21 | 长春光华学院 | It is a kind of for the field VR every empty gesture identification method |
CN110297540A (en) * | 2019-06-12 | 2019-10-01 | 浩博泰德(北京)科技有限公司 | A kind of human-computer interaction device and man-machine interaction method |
US10466797B2 (en) | 2014-04-03 | 2019-11-05 | Huawei Technologies Co., Ltd. | Pointing interaction method, apparatus, and system |
CN110853073A (en) * | 2018-07-25 | 2020-02-28 | 北京三星通信技术研究有限公司 | Method, device, equipment and system for determining attention point and information processing method |
CN111527468A (en) * | 2019-11-18 | 2020-08-11 | 华为技术有限公司 | Air-to-air interaction method, device and equipment |
WO2020183249A1 (en) * | 2019-03-08 | 2020-09-17 | Indian Institute Of Science | A system for man-machine interaction in vehicles |
CN112363626A (en) * | 2020-11-25 | 2021-02-12 | 广州魅视电子科技有限公司 | Large screen interaction control method based on human body posture and gesture posture visual recognition |
CN112667078A (en) * | 2020-12-24 | 2021-04-16 | 西安电子科技大学 | Method and system for quickly controlling mouse in multi-screen scene based on sight estimation and computer readable medium |
CN113515190A (en) * | 2021-05-06 | 2021-10-19 | 广东魅视科技股份有限公司 | Mouse function implementation method based on human body gestures |
WO2022027435A1 (en) * | 2020-08-06 | 2022-02-10 | Huawei Technologies Co., Ltd. | Activating cross-device interaction with pointing gesture recognition |
WO2023004553A1 (en) * | 2021-07-26 | 2023-02-02 | 广州视源电子科技股份有限公司 | Method and system for implementing fingertip mouse |
CN114967927B (en) * | 2022-05-30 | 2024-04-16 | 桂林电子科技大学 | Intelligent gesture interaction method based on image processing |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102520794B (en) * | 2011-12-06 | 2014-11-19 | 华映视讯(吴江)有限公司 | Gesture recognition system and method |
Application Events
- 2008-08-15: Application CN2008100301944A filed in China; granted as patent CN101344816B; status: not active (Expired - Fee Related)
Cited By (156)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102012778A (en) * | 2009-09-04 | 2011-04-13 | 索尼公司 | Display control apparatus, display control method, and display control program |
CN101694692B (en) * | 2009-10-22 | 2011-09-07 | 浙江大学 | Gesture identification method based on acceleration transducer |
CN101719015B (en) * | 2009-11-03 | 2011-08-31 | 上海大学 | Method for positioning finger tips of directed gestures |
WO2011079640A1 (en) * | 2009-12-29 | 2011-07-07 | Hu Shixi | Method for determining whether target point belongs to flat plane or not, mouse and touch screen |
CN101783865A (en) * | 2010-02-26 | 2010-07-21 | 中山大学 | Digital set-top box and intelligent mouse control method based on same |
CN101799717A (en) * | 2010-03-05 | 2010-08-11 | 天津大学 | Man-machine interaction method based on hand action catch |
CN102192173A (en) * | 2010-03-08 | 2011-09-21 | 艾美特电器(深圳)有限公司 | Intelligent gesture control electric fan |
CN101813976A (en) * | 2010-03-09 | 2010-08-25 | 华南理工大学 | Sighting tracking man-computer interaction method and device based on SOC (System On Chip) |
WO2011127646A1 (en) * | 2010-04-13 | 2011-10-20 | Nokia Corporation | An apparatus, method, computer program and user interface |
US9535493B2 (en) | 2010-04-13 | 2017-01-03 | Nokia Technologies Oy | Apparatus, method, computer program and user interface |
CN102222342A (en) * | 2010-04-16 | 2011-10-19 | 上海摩比源软件技术有限公司 | Tracking method of human body motions and identification method thereof |
CN102270035A (en) * | 2010-06-04 | 2011-12-07 | 三星电子株式会社 | Apparatus and method for selecting and operating object in non-touch mode |
CN102270275B (en) * | 2010-06-04 | 2017-03-01 | 汤姆森特许公司 | The method of selecting object and multimedia terminal in virtual environment |
US9298346B2 (en) | 2010-06-04 | 2016-03-29 | Thomson Licensing | Method for selection of an object in a virtual environment |
CN102270275A (en) * | 2010-06-04 | 2011-12-07 | 汤姆森特许公司 | Method for selection of an object in a virtual environment |
CN102279669A (en) * | 2010-06-11 | 2011-12-14 | 游森溢 | Input system for using display image |
CN101872244A (en) * | 2010-06-25 | 2010-10-27 | 中国科学院软件研究所 | Method for human-computer interaction based on hand movement and color information of user |
CN101901339A (en) * | 2010-07-30 | 2010-12-01 | 华南理工大学 | Hand movement detecting method |
CN101923433A (en) * | 2010-08-17 | 2010-12-22 | 北京航空航天大学 | Man-computer interaction mode based on hand shadow identification |
WO2012034469A1 (en) * | 2010-09-17 | 2012-03-22 | 腾讯科技(深圳)有限公司 | Gesture-based human-computer interaction method and system, and computer storage media |
CN102402279A (en) * | 2010-09-17 | 2012-04-04 | 腾讯科技(深圳)有限公司 | Human-computer interaction method and system based on gestures |
CN102402279B (en) * | 2010-09-17 | 2016-05-25 | 腾讯科技(深圳)有限公司 | Man-machine interaction method based on gesture and system |
US8963836B2 (en) | 2010-09-17 | 2015-02-24 | Tencent Technology (Shenzhen) Company Limited | Method and system for gesture-based human-machine interaction and computer-readable medium thereof |
CN103154858A (en) * | 2010-09-22 | 2013-06-12 | 岛根县 | Operation input apparatus, operation input method, and program |
CN103154858B (en) * | 2010-09-22 | 2016-03-30 | 岛根县 | Input device and method and program |
CN102457607B (en) * | 2010-10-20 | 2016-05-04 | 浪潮乐金数字移动通信有限公司 | A kind of video sensing input mobile communication terminal and video sensing input method thereof |
CN102457607A (en) * | 2010-10-20 | 2012-05-16 | 浪潮乐金数字移动通信有限公司 | Image sensing input mobile communication terminal and image sensing input method thereof |
CN102467234A (en) * | 2010-11-12 | 2012-05-23 | Lg电子株式会社 | Method for providing display image in multimedia device and multimedia device thereof |
CN102467236A (en) * | 2010-11-17 | 2012-05-23 | 夏普株式会社 | Instruction accepting apparatus and instruction accepting method |
US9746931B2 (en) | 2010-12-27 | 2017-08-29 | Hitachi Maxell, Ltd. | Image processing device and image display device |
CN102591451A (en) * | 2010-12-27 | 2012-07-18 | 日立民用电子株式会社 | Image processing device and image display device |
CN102591451B (en) * | 2010-12-27 | 2015-12-02 | 日立麦克赛尔株式会社 | Image processor and image display |
US9086726B2 (en) | 2010-12-27 | 2015-07-21 | Hitachi Maxell, Ltd. | Image processing device and image display device |
CN102081503A (en) * | 2011-01-25 | 2011-06-01 | 汉王科技股份有限公司 | Electronic reader capable of automatically turning pages based on eye tracking and method thereof |
CN102622081A (en) * | 2011-01-30 | 2012-08-01 | 北京新岸线网络技术有限公司 | Method and system for realizing somatic sensory interaction |
CN102622081B (en) * | 2011-01-30 | 2016-06-08 | 北京新岸线移动多媒体技术有限公司 | A kind of realize the mutual method of body sense and system |
CN102761684B (en) * | 2011-04-28 | 2015-08-05 | 奥林巴斯映像株式会社 | The recording method of photographic equipment and view data |
CN102761684A (en) * | 2011-04-28 | 2012-10-31 | 奥林巴斯映像株式会社 | An image capturing device and an image data recording method |
CN102142084A (en) * | 2011-05-06 | 2011-08-03 | 北京网尚数字电影院线有限公司 | Method for gesture recognition |
CN102142084B (en) * | 2011-05-06 | 2012-12-26 | 北京网尚数字电影院线有限公司 | Method for gesture recognition |
CN103502912B (en) * | 2011-05-09 | 2017-11-07 | 皇家飞利浦有限公司 | Object on rotating screen |
CN103502912A (en) * | 2011-05-09 | 2014-01-08 | 皇家飞利浦有限公司 | Rotating an object on a screen |
CN102200834B (en) * | 2011-05-26 | 2012-10-31 | 华南理工大学 | Television control-oriented finger-mouse interaction method |
CN102200834A (en) * | 2011-05-26 | 2011-09-28 | 华南理工大学 | television control-oriented finger-mouse interaction method |
CN102184021A (en) * | 2011-05-27 | 2011-09-14 | 华南理工大学 | Television man-machine interaction method based on handwriting input and fingertip mouse |
US8823642B2 (en) | 2011-07-04 | 2014-09-02 | 3Divi Company | Methods and systems for controlling devices using gestures and related 3D sensor |
RU2455676C2 (en) * | 2011-07-04 | 2012-07-10 | Общество с ограниченной ответственностью "ТРИДИВИ" | Method of controlling device using gestures and 3d sensor for realising said method |
US8896522B2 (en) | 2011-07-04 | 2014-11-25 | 3Divi Company | User-centric three-dimensional interactive control environment |
US9251409B2 (en) | 2011-10-18 | 2016-02-02 | Nokia Technologies Oy | Methods and apparatuses for gesture recognition |
WO2013056431A1 (en) * | 2011-10-18 | 2013-04-25 | Nokia Corporation | Methods and apparatuses for gesture recognition |
CN103092334A (en) * | 2011-10-31 | 2013-05-08 | 财团法人资讯工业策进会 | Virtual mouse driving device and virtual mouse simulation method |
CN102411478A (en) * | 2011-11-16 | 2012-04-11 | 鸿富锦精密工业(深圳)有限公司 | Electronic device and text guiding method therefor |
CN102411477A (en) * | 2011-11-16 | 2012-04-11 | 鸿富锦精密工业(深圳)有限公司 | Electronic equipment and text reading guide method thereof |
CN102402289A (en) * | 2011-11-22 | 2012-04-04 | 华南理工大学 | Mouse recognition method for gesture based on machine vision |
CN102402289B (en) * | 2011-11-22 | 2014-09-10 | 华南理工大学 | Mouse recognition method for gesture based on machine vision |
CN103999018B (en) * | 2011-12-06 | 2016-12-28 | 汤姆逊许可公司 | The user of response three-dimensional display object selects the method and system of posture |
WO2013082760A1 (en) * | 2011-12-06 | 2013-06-13 | Thomson Licensing | Method and system for responding to user's selection gesture of object displayed in three dimensions |
CN103164022A (en) * | 2011-12-16 | 2013-06-19 | 国际商业机器公司 | Multi-finger touch method, device and portable type terminal device |
CN103164022B (en) * | 2011-12-16 | 2016-03-16 | 国际商业机器公司 | Many fingers touch method and device, portable terminal |
CN103177373A (en) * | 2011-12-22 | 2013-06-26 | 上海易络客网络技术有限公司 | Interactive commercial experiencing method and system |
CN102592115A (en) * | 2011-12-26 | 2012-07-18 | Tcl集团股份有限公司 | Hand positioning method and system |
CN107368191A (en) * | 2012-01-04 | 2017-11-21 | 托比股份公司 | For watching interactive system attentively |
CN107368191B (en) * | 2012-01-04 | 2020-09-25 | 托比股份公司 | System for gaze interaction |
CN104145232B (en) * | 2012-01-04 | 2017-09-22 | 托比股份公司 | The interactive system for watching attentively |
CN104145232A (en) * | 2012-01-04 | 2014-11-12 | 托比技术股份公司 | System for gaze interaction |
CN103218105B (en) * | 2012-01-19 | 2016-07-06 | 联想(北京)有限公司 | The processing method of electronic equipment, system and electronic equipment |
CN103218105A (en) * | 2012-01-19 | 2013-07-24 | 联想(北京)有限公司 | Processing method and system for electronic equipment, and electronic equipment |
CN103294173A (en) * | 2012-02-24 | 2013-09-11 | 冠捷投资有限公司 | Remote control system based on user actions and method thereof |
CN102662464A (en) * | 2012-03-26 | 2012-09-12 | 华南理工大学 | Gesture control method of gesture roaming control system |
CN102710908A (en) * | 2012-05-31 | 2012-10-03 | 无锡商业职业技术学院 | Device for controlling television based on gesture |
CN103513906B (en) * | 2012-06-28 | 2018-01-16 | 联想(北京)有限公司 | A kind of command identifying method, device and electronic equipment |
CN103513906A (en) * | 2012-06-28 | 2014-01-15 | 联想(北京)有限公司 | Order identifying method and device and electronic device |
CN103529929A (en) * | 2012-07-06 | 2014-01-22 | 原相科技股份有限公司 | Gesture recognition system and glasses capable of recognizing gesture actions |
CN104662587A (en) * | 2012-07-27 | 2015-05-27 | 日本电气方案创新株式会社 | Three-dimensional user-interface device, and three-dimensional operation method |
CN104662587B (en) * | 2012-07-27 | 2017-07-04 | 日本电气方案创新株式会社 | Three-dimensional user interface device and three-dimensional manipulating method |
CN107193373A (en) * | 2012-09-03 | 2017-09-22 | 联想(北京)有限公司 | A kind of information processing method and electronic equipment |
CN107193373B (en) * | 2012-09-03 | 2020-04-24 | 联想(北京)有限公司 | Information processing method and electronic equipment |
CN104541232A (en) * | 2012-09-28 | 2015-04-22 | 英特尔公司 | Multi-modal touch screen emulator |
CN103777744A (en) * | 2012-10-23 | 2014-05-07 | 中国移动通信集团公司 | Method and device for achieving input control and mobile terminal |
CN108845668A (en) * | 2012-11-07 | 2018-11-20 | 北京三星通信技术研究有限公司 | Man-machine interactive system and method |
CN102981742A (en) * | 2012-11-28 | 2013-03-20 | 无锡市爱福瑞科技发展有限公司 | Gesture interaction system based on computer visions |
CN103853339A (en) * | 2012-12-03 | 2014-06-11 | 广达电脑股份有限公司 | Input device and electronic device |
CN103914146A (en) * | 2013-01-08 | 2014-07-09 | 英飞凌科技股份有限公司 | Method for controlling control parameter through gestures |
CN103914146B (en) * | 2013-01-08 | 2017-03-01 | 英飞凌科技股份有限公司 | By the control to control parameter for the gesture recognition |
CN104142741A (en) * | 2013-05-08 | 2014-11-12 | 宏碁股份有限公司 | Electronic device and touch control detecting method thereof |
CN104142741B (en) * | 2013-05-08 | 2017-04-12 | 宏碁股份有限公司 | Electronic device and touch control detecting method thereof |
WO2014186972A1 (en) * | 2013-05-24 | 2014-11-27 | Thomson Licensing | Method and apparatus for rendering object for multiple 3d displays |
US10275933B2 (en) | 2013-05-24 | 2019-04-30 | Thomson Licensing | Method and apparatus for rendering object for multiple 3D displays |
CN103309446B (en) * | 2013-05-30 | 2016-03-02 | 上海交通大学 | The virtual data being carrier with mankind's both hands obtains and transmission system |
CN103309446A (en) * | 2013-05-30 | 2013-09-18 | 上海交通大学 | Virtual data acquisition and transmission system taking both hands of humankind as carrier |
CN103442177A (en) * | 2013-08-30 | 2013-12-11 | 程治永 | PTZ video camera control system and method based on gesture identification |
CN103488299B (en) * | 2013-10-15 | 2016-11-23 | 大连市恒芯科技有限公司 | A kind of intelligent terminal man-machine interaction method merging face and gesture |
CN103488299A (en) * | 2013-10-15 | 2014-01-01 | 大连市恒芯科技有限公司 | Intelligent terminal man-machine interaction method fusing human face and gestures |
CN103577075A (en) * | 2013-11-11 | 2014-02-12 | 惠州Tcl移动通信有限公司 | Parameter adjusting method and device of electronic equipment |
CN104680123A (en) * | 2013-11-26 | 2015-06-03 | 富士通株式会社 | Object identification device, object identification method and program |
CN103616954A (en) * | 2013-12-06 | 2014-03-05 | Tcl通讯(宁波)有限公司 | Virtual keyboard system, implementation method and mobile terminal |
CN103690146A (en) * | 2013-12-13 | 2014-04-02 | 重庆大学 | Novel eye tracker |
CN103761508A (en) * | 2014-01-02 | 2014-04-30 | 大连理工大学 | Biological recognition method and system combining face and gestures |
CN103793060A (en) * | 2014-02-14 | 2014-05-14 | 杨智 | User interaction system and method |
CN103793060B (en) * | 2014-02-14 | 2017-07-28 | 杨智 | A kind of user interactive system and method |
WO2015127720A1 (en) * | 2014-02-25 | 2015-09-03 | 中兴通讯股份有限公司 | Terminal operation method, apparatus and terminal |
CN104866081A (en) * | 2014-02-25 | 2015-08-26 | 中兴通讯股份有限公司 | Terminal operation method and device as well as terminal |
CN104932668A (en) * | 2014-03-20 | 2015-09-23 | 冠捷投资有限公司 | Play content driving device and method for display system |
CN103888820A (en) * | 2014-03-28 | 2014-06-25 | 湖南网圣腾飞信息技术有限公司 | Game control system and method with smart mobile phone and intelligent television combined |
US10466797B2 (en) | 2014-04-03 | 2019-11-05 | Huawei Technologies Co., Ltd. | Pointing interaction method, apparatus, and system |
CN104090663B (en) * | 2014-07-14 | 2016-03-23 | 济南大学 | A kind of view-based access control model pays close attention to the gesture interaction method of model |
CN104090663A (en) * | 2014-07-14 | 2014-10-08 | 济南大学 | Gesture interaction method based on visual attention model |
US10168773B2 (en) | 2014-07-15 | 2019-01-01 | Huawei Technologies Co., Ltd. | Position locating method and apparatus |
CN104181838A (en) * | 2014-08-07 | 2014-12-03 | 重庆电子工程职业学院 | Gesture control glass window cleaning apparatus based on Internet of Things technology |
CN105491426A (en) * | 2014-09-16 | 2016-04-13 | 洪永川 | Gesture remote control television system |
CN105630142A (en) * | 2014-11-07 | 2016-06-01 | 中兴通讯股份有限公司 | Method and device for publishing and transferring identification information, and information identification system |
CN104714639B (en) * | 2014-12-30 | 2017-09-29 | 上海孩子国科教设备有限公司 | Across the space method and client operated |
CN104714639A (en) * | 2014-12-30 | 2015-06-17 | 上海孩子国科教设备有限公司 | Trans-space operation method and client side |
CN107533416A (en) * | 2015-03-20 | 2018-01-02 | 株式会社理光 | Display device, display control method, display control program and display system |
CN108369630A (en) * | 2015-05-28 | 2018-08-03 | 视觉移动科技有限公司 | Gestural control system and method for smart home |
CN108027886A (en) * | 2015-09-23 | 2018-05-11 | 高通股份有限公司 | Use the system and method for increment object detection of dual threshold local binary pattern operator |
WO2017084319A1 (en) * | 2015-11-18 | 2017-05-26 | 乐视控股(北京)有限公司 | Gesture recognition method and virtual reality display output device |
CN105338390A (en) * | 2015-12-09 | 2016-02-17 | 陈国铭 | Intelligent television control system |
CN105929954B (en) * | 2016-04-19 | 2019-10-18 | 京东方科技集团股份有限公司 | It is a kind of control cursor method and device, display equipment |
CN105929954A (en) * | 2016-04-19 | 2016-09-07 | 京东方科技集团股份有限公司 | Cursor control method and apparatus as well as display device |
CN106022211B (en) * | 2016-05-04 | 2019-06-28 | 北京航空航天大学 | A method of utilizing gesture control multimedia equipment |
CN106022211A (en) * | 2016-05-04 | 2016-10-12 | 北京航空航天大学 | Method using gestures to control multimedia device |
CN106020478A (en) * | 2016-05-20 | 2016-10-12 | 青岛海信电器股份有限公司 | Intelligent terminal manipulation method, intelligent terminal manipulation apparatus and intelligent terminal |
CN106020478B (en) * | 2016-05-20 | 2019-09-13 | 青岛海信电器股份有限公司 | A kind of intelligent terminal control method, device and intelligent terminal |
WO2018076848A1 (en) * | 2016-10-27 | 2018-05-03 | 蔚来汽车有限公司 | Gesture control method for virtual reality head-mounted display device |
CN107015636A (en) * | 2016-10-27 | 2017-08-04 | 蔚来汽车有限公司 | The aobvious equipment gestural control method of virtual reality |
CN106569600A (en) * | 2016-10-31 | 2017-04-19 | 邯郸美的制冷设备有限公司 | Gesture verification method and device for controlling air conditioners |
CN106778597B (en) * | 2016-12-12 | 2020-04-10 | 朱明� | Intelligent vision detector based on image analysis |
CN106778597A (en) * | 2016-12-12 | 2017-05-31 | 朱明� | Intellectual vision measurer based on graphical analysis |
DE102016124906A1 (en) * | 2016-12-20 | 2017-11-30 | Miele & Cie. Kg | Method of controlling a floor care appliance and floor care appliance |
CN107272899A (en) * | 2017-06-21 | 2017-10-20 | 北京奇艺世纪科技有限公司 | A kind of VR exchange methods, device and electronic equipment based on dynamic gesture |
CN107463257A (en) * | 2017-08-03 | 2017-12-12 | 微景天下(北京)科技有限公司 | A kind of man-machine interaction method and device of Virtual Reality system |
CN107463257B (en) * | 2017-08-03 | 2020-08-21 | 微景天下(北京)科技有限公司 | Human-computer interaction method and device of virtual reality VR system |
CN107491755A (en) * | 2017-08-16 | 2017-12-19 | 京东方科技集团股份有限公司 | Method and device for gesture identification |
CN107491755B (en) * | 2017-08-16 | 2021-04-27 | 京东方科技集团股份有限公司 | Method and device for gesture recognition |
US10509948B2 (en) | 2017-08-16 | 2019-12-17 | Boe Technology Group Co., Ltd. | Method and device for gesture recognition |
CN107741784A (en) * | 2017-10-09 | 2018-02-27 | 济南大学 | A kind of amusement exchange method suitable for leaden paralysis patient |
CN108108024A (en) * | 2018-01-02 | 2018-06-01 | 京东方科技集团股份有限公司 | Dynamic gesture acquisition method and device, and display device |
CN110853073A (en) * | 2018-07-25 | 2020-02-28 | 北京三星通信技术研究有限公司 | Method, device, equipment and system for determining attention point and information processing method |
CN109190516A (en) * | 2018-08-14 | 2019-01-11 | 东北大学 | Static gesture recognition method based on palm edge contour vectorization |
CN109542219A (en) * | 2018-10-22 | 2019-03-29 | 广东精标科技股份有限公司 | Gesture interaction system and method applied to smart classrooms |
CN109542219B (en) * | 2018-10-22 | 2021-07-30 | 广东精标科技股份有限公司 | Gesture interaction system and method applied to intelligent classroom |
CN109597489A (en) * | 2018-12-27 | 2019-04-09 | 武汉市天蝎科技有限公司 | Eye-tracking interaction method and system for a near-eye display device |
CN109683719A (en) * | 2019-01-30 | 2019-04-26 | 华南理工大学 | Visual projection interaction method based on YOLOv3 |
WO2020183249A1 (en) * | 2019-03-08 | 2020-09-17 | Indian Institute Of Science | A system for man-machine interaction in vehicles |
CN109917921A (en) * | 2019-03-28 | 2019-06-21 | 长春光华学院 | Mid-air gesture recognition method for the VR field |
CN110297540A (en) * | 2019-06-12 | 2019-10-01 | 浩博泰德(北京)科技有限公司 | Human-computer interaction device and method |
CN111527468A (en) * | 2019-11-18 | 2020-08-11 | 华为技术有限公司 | Mid-air interaction method, apparatus and device |
WO2022027435A1 (en) * | 2020-08-06 | 2022-02-10 | Huawei Technologies Co., Ltd. | Activating cross-device interaction with pointing gesture recognition |
CN112363626A (en) * | 2020-11-25 | 2021-02-12 | 广州魅视电子科技有限公司 | Large-screen interaction control method based on visual recognition of human body posture and hand gesture |
CN112363626B (en) * | 2020-11-25 | 2021-10-01 | 广东魅视科技股份有限公司 | Large-screen interaction control method based on visual recognition of human body posture and hand gesture |
CN112667078A (en) * | 2020-12-24 | 2021-04-16 | 西安电子科技大学 | Method, system and computer-readable medium for fast mouse control in multi-screen scenarios based on gaze estimation |
CN112667078B (en) * | 2020-12-24 | 2023-06-09 | 西安电子科技大学 | Method, system and computer-readable medium for fast mouse control in multi-screen scenarios based on gaze estimation |
CN113515190A (en) * | 2021-05-06 | 2021-10-19 | 广东魅视科技股份有限公司 | Mouse function implementation method based on human body gestures |
WO2023004553A1 (en) * | 2021-07-26 | 2023-02-02 | 广州视源电子科技股份有限公司 | Method and system for implementing fingertip mouse |
CN114967927B (en) * | 2022-05-30 | 2024-04-16 | 桂林电子科技大学 | Intelligent gesture interaction method based on image processing |
Also Published As
Publication number | Publication date |
---|---|
CN101344816B (en) | 2010-08-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN101344816B (en) | Human-machine interaction method and device based on sight tracing and gesture discriminating | |
Sagayam et al. | Hand posture and gesture recognition techniques for virtual reality applications: a survey | |
CN107168527B (en) | First-person-view gesture recognition and interaction method based on region-based convolutional neural networks | |
CN102402289B (en) | Mouse recognition method for gesture based on machine vision | |
CN103150019B (en) | Handwriting input system and method | |
CN102831404A (en) | Method and system for detecting gestures | |
CN108703824B (en) | Bionic hand control system and control method based on myoelectricity bracelet | |
CN101853071A (en) | Vision-based gesture recognition method and system | |
CN104571482A (en) | Digital device control method based on somatosensory recognition | |
CN111898407B (en) | Human-computer interaction operating system based on human face action recognition | |
CN109800676A (en) | Gesture recognition method and system based on depth information | |
CN106886741A (en) | Gesture recognition method based on finger recognition | |
CN106933340A (en) | Gesture motion recognition method, control method and device, and wrist-worn device | |
CN103793056A (en) | Mid-air gesture roaming control method based on distance vector | |
CN109839827B (en) | Gesture recognition intelligent household control system based on full-space position information | |
CN110866468A (en) | Gesture recognition system and method based on passive RFID | |
CN109325408A (en) | Gesture judgment method and storage medium | |
Kumar et al. | A hybrid gesture recognition method for American sign language | |
CN103034851A (en) | Device and method for hand tracking based on a self-learning skin-color model | |
Zhang et al. | Robotic control of dynamic and static gesture recognition | |
CN103426000B (en) | Static gesture fingertip detection method | |
CN211293894U (en) | Mid-air handwriting interaction device | |
ul Haq et al. | New hand gesture recognition method for mouse operations | |
CN109032355B (en) | Flexible mapping interaction method for mapping multiple gestures to the same interaction command | |
US10095308B2 (en) | Gesture based human machine interface using marker |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
C14 | Grant of patent or utility model | ||
GR01 | Patent grant | ||
CF01 | Termination of patent right due to non-payment of annual fee |
Granted publication date: 20100811 Termination date: 20180815 |
|
CF01 | Termination of patent right due to non-payment of annual fee |