As sensor technologies become pervasive and their precision increases, we are creating a ‘fertile substrate’ for augmenting human senses. For example, digital cameras have greater sensitivity to light changes than the human eye, and modern buildings incorporate a variety of sensors for collecting data. However, these technologies are often disconnected from humans. This talk centers on the design and development of mobile assistive technology, user interfaces, and interactions that seamlessly integrate with a user’s mind, body, and behavior to provide enhanced perception. We call this ‘Assistive Augmentation’. Creating such Assistive Augmentations poses a twofold challenge, as they require: (1) novel hardware technologies and interfaces that capture relevant sensory information and understand the physical environment in which they are used, while remaining unobtrusive; and (2) a holistic design approach that increases efficiency and supports independence and social acceptance, to account for real-world applicability. Drawing on modern biological understanding of sensation, emerging electronic devices, computational methods, and design thinking, we now have an opportunity to design a new generation of Assistive Augmentations. This talk will present several proof-of-concept Assistive Augmentations for enhancing human I/O in the focus areas of assistive technologies, novel input strategies, smart health and well-being, and interactive learning technologies.