The Dynamic Search Fireworks Algorithm (dynFWA) is an effective algorithm for solving optimization problems. However, dynFWA tends to fall into local optima prematurely and converges slowly. To address these problems, an improved dynFWA (IdynFWA) is proposed in this chapter. In IdynFWA, the population is first initialized using opposition-based learning. An adaptive mutation is then proposed for the core firework (CF), which chooses between Gaussian mutation and Lévy mutation for the CF according to a mutation probability. A new selection strategy, namely disruptive selection, is proposed to maintain population diversity. The results show that the proposed algorithm achieves better overall performance on standard test functions. In addition, IdynFWA is used to optimize the Extreme Learning Machine (ELM), and a virtual machine fault warning model is built on the ELM optimized by IdynFWA. The results show that this model achieves higher accuracy and better stability.
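Two of the components summarized above can be sketched in code: opposition-based population initialization, and the adaptive mutation that picks Gaussian or Lévy mutation for the core firework according to a mutation probability. This is a minimal illustrative sketch, not the chapter's exact formulation: the function names, the search bounds, the mutation probability `p_gauss`, and the use of Mantegna's method for the Lévy step are all assumptions made for the example.

```python
import numpy as np
from math import gamma, sin, pi

rng = np.random.default_rng(0)

def obl_init(n, dim, lb, ub, objective):
    """Opposition-based learning init (sketch): sample n points,
    form their opposites lb + ub - x, and keep the best n of the 2n."""
    x = rng.uniform(lb, ub, size=(n, dim))
    opp = lb + ub - x                       # opposite population
    cand = np.vstack([x, opp])
    fitness = np.apply_along_axis(objective, 1, cand)
    return cand[np.argsort(fitness)[:n]]    # best n candidates, ascending

def levy_step(dim, beta=1.5):
    """Levy-distributed step via Mantegna's algorithm (assumed here)."""
    sigma = (gamma(1 + beta) * sin(pi * beta / 2) /
             (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0.0, sigma, dim)
    v = rng.normal(0.0, 1.0, dim)
    return u / np.abs(v) ** (1 / beta)

def adaptive_mutate(cf, p_gauss=0.5):
    """Adaptive mutation of the core firework (CF): Gaussian mutation
    with probability p_gauss, otherwise Levy mutation."""
    if rng.random() < p_gauss:
        return cf * (1.0 + rng.normal(0.0, 1.0, cf.shape))  # Gaussian
    return cf + levy_step(cf.size)                          # Levy

# Usage on a toy sphere objective
sphere = lambda x: float(np.sum(x ** 2))
pop = obl_init(10, 5, -5.0, 5.0, sphere)
mutant = adaptive_mutate(pop[0])
```

In this sketch the opposition step doubles the candidate pool at no extra sampling cost before discarding the worse half, which is the usual rationale for opposition-based initialization.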