To help ease companies into a deeper understanding of what AI can provide, our data science team avoids black box solutions for the first deployment of a model. Black box models are highly complex and don't let you see how a prediction was reached. Instead, we prefer to start with algorithms that provide explanations for their predictions, such as decision trees or linear models. That way we can validate the model's reasoning and confirm it derived its answers in sensible ways.
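As a minimal sketch of this "explainable first" approach, the snippet below trains a shallow decision tree and prints the exact rules it learned, so a reviewer can trace and sanity-check each prediction. The dataset and tree depth here are illustrative choices, not a prescription.

```python
# Illustrative example: an interpretable model whose reasoning can be
# inspected directly, in contrast to a black box neural network.
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier, export_text

data = load_iris()

# A shallow tree keeps the rule set small enough to review by hand.
model = DecisionTreeClassifier(max_depth=2, random_state=0)
model.fit(data.data, data.target)

# export_text renders the learned decision rules as readable text,
# letting stakeholders judge whether the splits are sensible.
rules = export_text(model, feature_names=list(data.feature_names))
print(rules)
```

Reviewing the printed rules with domain experts is what lets the team confirm the model is keying on meaningful features rather than artifacts in the data pipeline.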
Once the model is consistently meeting the agreed-upon metrics, and the data pipeline is stable and well understood, you have the option of exploring more powerful black box solutions such as neural networks.