Proper selection of drilling parameters and dynamic behavior is critical to improving drilling performance and efficiency. The development of an efficient artificial intelligence (AI) method to predict the appropriate control parameters is therefore essential for drilling optimization. The AI approach presented in this paper uses optimized artificial neural networks (ANNs) to model the behavior of the nonlinear, multi-input/multi-output drilling system. The model was optimized with a hybrid of a genetic algorithm (GA) and pattern search (PS) to reach the global optimum, which also provides the drilling planning team with quantified recommendations on the optimal drilling parameters. The optimized ANN model used drilling-parameter data recorded in real time from drilling operations in different lithological units. Representative portions of the data sets were used for training, testing, and validation of the model. The results demonstrate that the AI method is a promising approach for simulating and predicting the behavior of the complex, multi-parameter drilling system. It is a powerful alternative to traditional analytical methods or real-time manipulation of the drilling parameters for mitigating drill-string vibrations and invisible lost time (ILT). The approach can be extended to drilling control and optimization, where it can contribute up to a 73% reduction in drilling time.
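The hybrid optimization strategy mentioned above (a genetic algorithm for global exploration followed by pattern search for local refinement) can be sketched as below. This is a minimal illustration, not the paper's implementation: the quadratic `objective`, its bounds, and all hyperparameters are hypothetical placeholders, whereas the paper's actual objective is the trained ANN model of the drilling system.

```python
import random

def objective(x):
    # Hypothetical surrogate for drilling performance (e.g., predicted-ROP
    # error); the assumed minimum is at (3.0, -1.5).
    return (x[0] - 3.0) ** 2 + (x[1] + 1.5) ** 2

def genetic_algorithm(obj, bounds, pop=30, gens=60, seed=0):
    """Coarse global search: selection, averaging crossover, Gaussian mutation."""
    rng = random.Random(seed)
    dim = len(bounds)
    population = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=obj)                 # rank by fitness (lower is better)
        parents = population[: pop // 2]         # elitist selection
        children = []
        while len(children) < pop - len(parents):
            a, b = rng.sample(parents, 2)
            child = [(ai + bi) / 2 for ai, bi in zip(a, b)]  # crossover
            i = rng.randrange(dim)                           # mutate one gene
            lo, hi = bounds[i]
            child[i] = min(max(child[i] + rng.gauss(0, 0.1 * (hi - lo)), lo), hi)
            children.append(child)
        population = parents + children
    return min(population, key=obj)

def pattern_search(obj, x, step=0.5, tol=1e-6):
    """Local refinement: poll each coordinate, contract the mesh on failure."""
    x = list(x)
    while step > tol:
        improved = False
        for i in range(len(x)):
            for d in (step, -step):
                trial = list(x)
                trial[i] += d
                if obj(trial) < obj(x):
                    x, improved = trial, True
        if not improved:
            step /= 2  # no polling direction improved: shrink the step
    return x

bounds = [(-10.0, 10.0), (-10.0, 10.0)]
coarse = genetic_algorithm(objective, bounds)   # GA lands near the basin
refined = pattern_search(objective, coarse)     # PS polishes to the optimum
print(refined)
```

In the same spirit as the paper's combined GA/PS scheme, the GA avoids getting trapped in local optima while the pattern search sharpens the GA's best candidate without needing gradients of the ANN surrogate.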