Artificial intelligence (AI) is developing rapidly, and its applications are growing with it. This growth is so quick that it often surprises even researchers who had hypothesized longer timescales. Within criminal profiling this is of particular interest, because AI can help to recognize errors and biases that are typical of human analysts [1]. Although training AI to recognize emotions from biometric parameters is becoming easier, the analyses that follow remain problematic. Biometric data are difficult to interpret because they are also influenced by cultural and social factors. In terrorism analysis, for instance, the behaviors under examination differ across groups or tribes. The influence of social factors therefore extends beyond the analysis of complex neural responses [2-4]. Ethical and moral factors also play a role in the interpretation of emotions for the judicial system [5]. AI cannot be used to reconstruct the origin of a crime [6], and only an expert's opinion can be considered reliable [7]. Only an analysis centered on the individual, combined with knowledge of psychopathology and the scientific analysis of non-verbal language, can help reconstruct the origin and dynamics of a crime [10-12]. In conclusion, although AI offers important support, since it can speed up parts of the analysis, it cannot currently replace humans in profiling [13-14].

With the chosen method, the analyses are ongoing, and the initial results indicate a trend toward greater reliability for profiling conducted by a human than for profiling performed by AI. This is not due to the AI's capacity for emotion recognition but rather to the methodology the AI employs. Humans respond to any sensory stimulation with an emotion, so every inference, reasoning step, or behavioral choice depends closely on the emotion experienced. In contrast, AI recognizes emotions through a process of analysis comparable to purely cognitive processes; consequently, it lacks the capacity for emotional recognition through empathy.

To guarantee the best possible analysis and limit the possibility of moral and ethical issues, it is extremely important for a human to oversee this process. AI can be used to recognize emotions based on biometric alterations, but it should not go further than that. Relying solely on its conclusions would be sterile and incomplete and, from a legal standpoint, could affect the admissibility of the analysis in court.
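To make the contrast above concrete, the sketch below shows, in minimal form, what "training AI to recognize emotions from biometric parameters" can look like in practice: a purely statistical classifier mapping signal values to emotion labels, with no access to context, culture, or empathy. The feature names (heart rate, skin conductance, facial muscle activity), the emotion categories, and the data are all hypothetical and synthetic, chosen only for illustration; they are not taken from the present study or from any specific dataset.

```python
# Illustrative sketch only: a minimal emotion classifier trained on synthetic
# "biometric" features. All feature names, labels, and values are hypothetical.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)
n_per_class = 200
emotions = ["neutral", "fear", "anger"]

# Synthetic biometric readings: each emotion shifts the mean of three signals
# (heart rate in bpm, skin conductance in microsiemens, facial EMG in arbitrary units).
means = {
    "neutral": [70.0, 2.0, 0.5],
    "fear":    [95.0, 6.0, 1.5],
    "anger":   [90.0, 4.0, 2.5],
}
X = np.vstack([rng.normal(means[e], [8.0, 1.0, 0.5], size=(n_per_class, 3))
               for e in emotions])
y = np.repeat(emotions, n_per_class)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=0)

# A purely statistical pattern matcher: it associates feature values with labels,
# which is the "cognitive" style of emotion recognition discussed in the text.
clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test)))
```

Such a model can reach high accuracy on the signals it was trained on, yet everything it "knows" about an emotion is a statistical association; the cultural, social, and empathic dimensions discussed above never enter the computation, which is why human oversight of its output remains necessary.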