Human emotion is probably the last thing people insist is not computable. Yet human emotion is being reverse engineered right now through the proxies of sentiment analysis and human biometrics.
Consider the following technological innovations:
1. Google Glass can correlate what you see with how it affects you by tracking pupil dilation.
2. Apple Watch can monitor your heart rate and skin temperature to detect subconscious excitement. Heart rate and galvanic skin response are the same signals lie detectors use.
3. The Xbox One watches your face to detect happiness or frustration while you play. Microsoft has even provided a mechanism in the SDK that lets developers write emotion-dependent control structures, so games can get harder or easier depending on your frustration or excitement levels.
4. We already routinely run voice analysis and textual sentiment analysis on things like phone calls and tweets.
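The textual side of this is the easiest to picture. A minimal sketch, assuming nothing more than two tiny hand-picked word lists (real systems use far larger lexicons or trained models):

```python
# Toy lexicon-based sentiment scoring: count positive words minus
# negative words. The word lists here are illustrative, not a real lexicon.
POSITIVE = {"love", "great", "happy", "awesome", "excited"}
NEGATIVE = {"hate", "awful", "sad", "angry", "frustrated"}

def sentiment(text: str) -> int:
    """Return a crude sentiment score: >0 positive, <0 negative."""
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

print(sentiment("I love this awesome game"))    # → 2
print(sentiment("I hate how frustrated I am"))  # → -2
```

Run the same scoring over a stream of tweets or call transcripts and you have a rough emotional pulse on the speaker, which is exactly the kind of proxy these systems trade in.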
By tying these technologies together, we have effectively strapped a lie detector to everyone's back. The machines are watching our faces and measuring our physiological responses to stimuli.
If Google image search and neural nets have taught us anything, it's that given a large enough sample size, we can achieve machine recognition and reproduction without being able to formally describe how it is accomplished. We just drop our samples into a neural net, and recognition and reproduction capabilities pop out the other end. OK, it's a little more complicated than that, but you get the idea.
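To make the "drop samples in, recognition pops out" point concrete, here is a toy sketch using a plain perceptron instead of a full neural net (the training phrases and labels below are made up for illustration):

```python
# A perceptron learns to separate positive from negative phrases from
# labeled examples alone; nobody hand-writes the classification rules.
LABELED = [
    ("love this great game", 1),
    ("what a happy day", 1),
    ("this is awful", -1),
    ("I hate losing", -1),
]

def features(text):
    """Bag-of-words features: the set of lowercased tokens."""
    return set(text.lower().split())

# Train: nudge word weights toward the label whenever we misclassify.
weights = {}
for _ in range(10):  # a few passes over the training data
    for text, label in LABELED:
        score = sum(weights.get(w, 0) for w in features(text))
        if score * label <= 0:  # wrong (or undecided): update
            for w in features(text):
                weights[w] = weights.get(w, 0) + label

def classify(text):
    """Return 1 (positive) or -1 (negative) for an unseen phrase."""
    return 1 if sum(weights.get(w, 0) for w in features(text)) > 0 else -1

print(classify("a great day"))  # → 1: learned from samples, not hand-coded
```

The weights encode a working classifier that no one could have written down in advance, which is the whole trick: scale the samples up to millions of faces, heartbeats, and tweets, and the same dynamic applies.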
Don't be surprised if your gadgets start not only understanding your emotional state, but also miming an emotional state back at you. Why? Because humans love to anthropomorphize things and, whether we admit it or not, we respond favorably to a product that seems to be alive and emotionally dynamic.