Computer-based sketches are geometrically accurate and can be rendered formally. However, existing sketching interfaces are complex, non-intuitive, and require considerable learning time for novice users. In our work, we aim to develop an intuitive gesture-based sketching interface that provides designers with the freedom of sketching in the air, without touching or wearing any physical device. With our novel sketching system, users can draw letters, symbols, and drawings using non-contact depth-sensing cameras, such as SoftKinetic and Leap Motion. The system records the user’s hand trajectory as a raw sketch. The sketch is then analyzed and beautified to express the user’s intent more accurately. The beautification process segments the sketch and reconstructs each segment to produce a cleaner result. In this process, we use the two-thirds power law, which provides kinematic features of the segments and helps to improve beautification. Our results show encouraging performance for a broad range of writing styles and drawings in the air.
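The two-thirds power law relates the instantaneous speed of a hand movement to the local curvature of its path, v(t) = k · κ(t)^(-1/3) (equivalently, angular velocity proportional to curvature to the power 2/3). The sketch below is a minimal illustration of how such a kinematic feature can be extracted from a sampled trajectory; it is not the paper's implementation, and the function name, finite-difference scheme, and the ellipse smoke test are illustrative assumptions. An ellipse traced at constant angular rate satisfies the law exactly, so the fitted exponent should come out near 1/3.

```python
import numpy as np

def power_law_exponent(points, dt=1.0):
    """Estimate speed and curvature along a sampled 2D trajectory and
    fit the exponent beta in v = k * kappa**(-beta) by linear regression
    in log-log space. The two-thirds power law predicts beta ~ 1/3.
    `points` is an (N, 2) array of positions; `dt` is the sample period.
    (Illustrative helper, not the paper's actual pipeline.)"""
    p = np.asarray(points, dtype=float)
    # First and second derivatives via central finite differences.
    d1 = np.gradient(p, dt, axis=0)
    d2 = np.gradient(d1, dt, axis=0)
    speed = np.hypot(d1[:, 0], d1[:, 1])
    # Unsigned curvature of a planar curve: |x'y'' - y'x''| / |v|^3
    num = np.abs(d1[:, 0] * d2[:, 1] - d1[:, 1] * d2[:, 0])
    kappa = num / np.maximum(speed, 1e-9) ** 3
    # Fit log v = log k - beta * log kappa on well-conditioned samples.
    mask = (speed > 1e-6) & (kappa > 1e-6)
    slope, _ = np.polyfit(np.log(kappa[mask]), np.log(speed[mask]), 1)
    return -slope

# Smoke test: an ellipse at constant angular rate obeys the law exactly.
t = np.linspace(0.0, 2.0 * np.pi, 400)
pts = np.column_stack([2.0 * np.cos(t), np.sin(t)])
beta = power_law_exponent(pts, dt=t[1] - t[0])
```

In a beautification pipeline, segments whose fitted exponent deviates strongly from 1/3 can be flagged as candidate corner or transition regions, which is one way such kinematic features can inform segmentation.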