Tablet computers are sensor-rich devices with substantial computational power and versatile communication capabilities. Although their primary market is consumer electronics, we believe they have high potential in human-robot interaction. Tablets can be programmed to replicate various functions of a head, such as vision, hearing, speech, and facial articulation. In this work, we present ChibiFace, a tablet-based robot head. Built on top of the Android operating system, it uses native Android and third-party open-source libraries to implement audio and visual signal processing algorithms, machine learning, and system integration with different robot platforms. Experiments carried out in industrial robotics settings show that ChibiFace can enhance the safety of human-robot interaction. The rapid pace of advancement and functional expansion in tablet computers, driven by intense market competition, suggests that highly capable, portable, and customizable tablet-based human-robot interfaces may become widely accessible in the near future.
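To make the safety-feedback idea concrete, the sketch below shows one way a tablet-based robot head might map a proximity reading to a facial expression. This is a minimal illustration, not the actual ChibiFace implementation: the class name, expression set, and distance thresholds are all assumptions chosen for clarity.

```java
// Hypothetical sketch: choosing a facial expression from a proximity
// reading, as a ChibiFace-style safety display might do. All names and
// thresholds here are illustrative assumptions, not the ChibiFace API.
public class ExpressionSelector {
    public enum Expression { NEUTRAL, ATTENTIVE, ALERT }

    // Pick an expression from a distance reading in meters.
    // Thresholds (0.5 m, 1.5 m) are assumed values for illustration.
    public static Expression select(double distanceMeters) {
        if (distanceMeters < 0.5) return Expression.ALERT;     // human very close: warn
        if (distanceMeters < 1.5) return Expression.ATTENTIVE; // human nearby: acknowledge
        return Expression.NEUTRAL;                             // no one in range
    }

    public static void main(String[] args) {
        // On a real tablet the distance would come from a depth camera
        // or the robot controller; here we just print sample decisions.
        System.out.println(select(0.3));
        System.out.println(select(1.0));
        System.out.println(select(3.0));
    }
}
```

In a full system, the selected expression would drive the rendered face on the tablet screen while the same proximity signal is forwarded to the robot controller to slow or stop motion.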