Big data refers to the storage of massive datasets together with the approaches and techniques used to manage and process them. Over the past few years, the number of people using the internet, email, and other internet-based applications has grown tremendously, and the volume of data generated worldwide is increasing exponentially. Big Data is mainly characterized by the 3V's: Volume, Velocity, and Variety. The Big Data Architecture Framework (BDAF) has been proposed to address all aspects of the Big Data ecosystem; it includes components such as Big Data Infrastructure, Big Data Analytics, data structures and models, Big Data Lifecycle Management, and Big Data Security. Consequently, storing, processing, and protecting large volumes of data has become a great challenge in the modern hyper-connected world. With the rise of work-from-home arrangements, many software professionals carry out development, implementation, testing, and maintenance of software over internet-connected systems, frequently sending and receiving large amounts of data to clients, supervisors, and other stakeholders as their work requires. Traditional data management models are not efficient for today's exponentially growing data from a variety of industries; Big Data systems are designed to meet this challenge of storing and managing huge volumes of data. In this paper, we give an overview of a Big Data analytics system for storing and processing huge volumes of various types of data. Overcoming security threats from factors such as viruses and worms is also a great challenge in protecting the large volumes of data held in a Big Data system.