Developing new crop varieties with improved traits through crop breeding programs is one of the most promising solutions to the projected food crisis of 2050, when agricultural production must double its current growth rate to feed a world population of 10 billion. Conventional crop breeding strategies rely largely on trial and error and manual labor, limiting the number of breeding materials that can be trialed, the accuracy of trait measurement, and the selection intensity for elite genotypes. These limitations have become a bottleneck to achieving the breeding efficiency required to meet future food demands. The goal of this research was to develop an integrated and automated high-throughput phenotyping (HTP) framework that leverages advanced remote sensing and artificial intelligence technologies to estimate key traits and select elite genotypes, thereby improving the selection intensity and accuracy of conventional soybean breeding. To achieve this goal, the research pursued three objectives: (1) develop an integrated and automated UAV HTP framework for measuring crop traits accurately and efficiently; (2) estimate the yield and maturity date of soybean breeding materials using UAV image features and machine learning models; and (3) select elite soybean lines using UAV image features to improve selection intensity and accuracy. A UAV-based HTP platform was developed to carry multispectral and high-resolution digital cameras along with geo-referencing units; the platform could cover a 9-acre field within 2 hours. An automated pipeline was developed to process the collected time-series images and generate labeled image features. The developed methods delivered accurate plant height measurements (coefficient of determination R² up to 0.90, with average errors within 5 cm) and consistent spectral reflectance.
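As a minimal sketch of the kind of per-plot features such a pipeline can derive, the snippet below computes a vegetation index from multispectral bands and plant height from the difference between a canopy surface model and a terrain model. All names, array shapes, and values are illustrative assumptions, not the dissertation's actual implementation.

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Normalized difference vegetation index from NIR and red reflectance."""
    return (nir - red) / (nir + red + 1e-9)  # epsilon avoids divide-by-zero

def plant_height(dsm: np.ndarray, dtm: np.ndarray) -> float:
    """Plot-level height: canopy surface model minus bare-ground terrain model."""
    return float(np.percentile(dsm - dtm, 95))  # upper percentile resists noise

# Toy 4x4 plot rasters (reflectance in [0, 1]; elevations in meters)
nir = np.full((4, 4), 0.60)
red = np.full((4, 4), 0.10)
dsm = np.full((4, 4), 101.0)  # hypothetical canopy surface elevation
dtm = np.full((4, 4), 100.2)  # hypothetical ground elevation

print(round(float(ndvi(nir, red).mean()), 3))  # high NDVI => dense green canopy
print(round(plant_height(dsm, dtm), 2))        # ~0.8 m canopy height
```

In practice such features would be computed per breeding plot from geo-referenced orthomosaics, then labeled with plot identifiers for downstream modeling.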
The UAV platform and image processing pipeline were applied to estimate two key agronomic traits for soybean breeding: maturity date and yield. A set of image features was collected on 326 soybean progeny lines near their maturity stages (R7-R8). Maturity dates were estimated using a partial least squares regression (PLSR) model with the image features as inputs and the breeders' visually scored maturity dates as outputs. The image-based maturity dates agreed closely with the visual ones (R² = 0.81), with a root mean square error (RMSE) of 1.4 days. To estimate soybean yield, 972 soybean breeding plots in three maturity groups were planted under rainfed conditions. A mixed convolutional neural network (CNN) model was built to estimate yield from seven image features (associated with plant height, canopy color, and canopy texture) and two categorical factors, maturity group and drought tolerance. The prediction model explained 78% of the variation in measured yield, with an RMSE of 391.0 kg/ha (33.8% of the average yield). To model the breeder's selection criteria and select elite soybean genotypes, a soybean breeding program was traced for three years. The progeny trial (PT) comprised 11,473 rows, of which 1,773 were selected for a preliminary yield trial (PYT) and 238 were further selected for an advanced yield trial (AYT). Seven agronomic traits, including yield, plant height, maturity date, flower and pubescence color, moisture, and lodging, were manually measured in the two yield trials. UAV imagery was collected every two weeks over the growing seasons, and a set of image features was extracted for each trial. Results show that the progeny lines had the most variation among the three trials and that images collected at earlier stages (before R5) explained more variation than those collected later.
A Lasso model selecting soybean lines from image features correctly identified 71% and 76% of the breeder's selections for the PT and PYT, respectively. The model's selections in the PT and PYT had 4% and 5% higher yield, respectively, than the breeder's selections. In summary, the developed UAV HTP platform can collect image features of soybean breeding materials efficiently and deliver accurate estimates of agronomic traits. Accurate and objective trait estimation reduces the phenotypic variation in breeding trials. By relieving human labor from onerous field evaluation, the population size of soybean breeding lines could be substantially increased, leading to higher selection intensity. Moreover, the proposed variety selection model was able to narrow down the breeder's selections, further increasing selection accuracy and intensity. It is therefore concluded that the developed UAV HTP platform has great potential to improve soybean breeding efficiency by decreasing phenotypic variation and increasing selection accuracy and intensity. This research could be scaled up to other crop breeding programs and offers a paradigm for improving breeding efficiency using HTP technologies.
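One plausible form of such a selection model is an L1-penalized (lasso-type) logistic classifier over image features, since the L1 penalty zeroes out uninformative features while predicting the binary select/reject label. The sketch below uses synthetic data; the feature count, sparsity pattern, and regularization strength are illustrative assumptions, not the dissertation's fitted model.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)

# Synthetic stand-in: image features for progeny rows, with a binary
# "selected by breeder" label driven by a sparse subset of features.
n_rows, n_features = 2000, 20
X = rng.normal(size=(n_rows, n_features))
w = np.zeros(n_features)
w[:4] = [1.5, -1.0, 0.8, 1.2]           # only a few features matter
logits = X @ w - 1.0                     # negative bias: a minority is selected
y = (logits + rng.logistic(size=n_rows) > 0).astype(int)

# The L1 penalty drives irrelevant coefficients to exactly zero,
# mimicking lasso-style feature selection inside the classifier.
clf = LogisticRegression(penalty="l1", C=0.5, solver="liblinear").fit(X, y)
accuracy = clf.score(X, y)
n_active = int(np.count_nonzero(clf.coef_))
print(f"agreement with labels={accuracy:.2f}, active features={n_active}/{n_features}")
```

Agreement with the breeder's decisions (the 71% and 76% figures above) would be measured on rows held out from model fitting, with the surviving nonzero coefficients indicating which image features drive selection.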