Gesture recognition is an expressive, alternative means of Human-Computer Interaction (HCI), which has recently drawn significant attention after the release of mass consumer applications and devices, including gesture-controlled interactive TV systems (iDTV) and advanced video-game environments. In this work, we propose a complete gesture recognition framework for continuous streams of static postures and dynamic trajectories of digits and letters, targeting both high recognition accuracy and increased computational efficiency. Special emphasis is given to four fundamental gesture recognition problems, i.e. hand detection and feature extraction, isolated recognition, gesture verification, and gesture spotting on continuous data streams.

Specifically, we propose a novel finger detection method based on geometrical hand-contour features (apex detection) and show its importance in hand posture recognition. We then present our approach to isolated recognition, which is based on Maximum Cosine Similarity (MCS) and a tree-based fast Nearest Neighbor (fastNN) technique, showing its high recognition accuracy and computational efficiency. Additionally, we relate the computational time required by fastNN for the classification of an unknown query vector to its Mahalanobis distance and maximum cosine similarity with respect to the set of training examples. This property allows us to perform gesture verification, while significantly reducing the search time.

Finally, we design a complete framework for gesture spotting on continuous streams of hand data, solving the joint problem of gesture detection and recognition. Specifically, we model subgesture relationships in a probabilistic way, using both the categories and the relative time positions of overlapping gesture candidates.
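To make the MCS classification idea concrete, the following is a minimal illustrative sketch (not the thesis's actual implementation, which additionally uses a tree-based fastNN index): a query feature vector is assigned the label of the training template with maximum cosine similarity. The names `classify_mcs` and `templates` are hypothetical.

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def classify_mcs(query, templates):
    """Return the label of the training template with maximum
    cosine similarity to the query, plus the similarity score.
    The score can double as a verification measure: low maximum
    similarity suggests the query is not a valid gesture."""
    best_label, best_sim = None, -1.0
    for label, vec in templates:
        sim = cosine_similarity(query, vec)
        if sim > best_sim:
            best_label, best_sim = label, sim
    return best_label, best_sim

# Toy example with two gesture classes.
templates = [("A", [1.0, 0.2, 0.1]), ("B", [0.1, 1.0, 0.3])]
label, score = classify_mcs([0.9, 0.25, 0.1], templates)
```

In the full framework, the linear scan above is replaced by the tree-based fastNN search, which prunes most comparisons while returning the same nearest template.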
Additionally, we introduce a novel metric for ranking conflicting gesture candidates, based on their time duration and cosine similarity score, which offers high conflict resolution performance for sequences of digits and letters. In all cases, we support our arguments through thorough experiments on real and synthetic gesture datasets, as well as with real-time gesture spotting applications.
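As an illustration of the conflict-resolution idea, the sketch below ranks overlapping gesture candidates by a duration-weighted similarity score; the specific weighting is a hypothetical choice for demonstration, not the exact metric proposed in this work.

```python
def rank_candidates(candidates):
    """Each candidate is (label, start_frame, end_frame, cosine_score).
    Rank by duration-weighted similarity, so that a longer, confidently
    matched gesture outranks a short spurious subgesture nested inside it.
    (Hypothetical weighting for illustration only.)"""
    def score(c):
        label, start, end, sim = c
        duration = end - start
        return duration * sim
    return sorted(candidates, key=score, reverse=True)

# A short '5' candidate detected inside a longer 'S' trajectory:
# the longer candidate with comparable similarity wins the conflict.
cands = [("5", 10, 20, 0.92), ("S", 8, 30, 0.90)]
ranked = rank_candidates(cands)
```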