Search for dissertations about: "eye sensor"
Showing results 1-5 of 23 Swedish dissertations containing the words eye sensor.
-
1. Resonator sensor technique for medical use : an intraocular pressure measurement system
Abstract : In this doctoral dissertation, a resonator sensor technique first presented in 1989 has been further developed and evaluated, with a focus on its technical characteristics and applications within the medical field. In the first part, a catheter-type tactile sensor using the resonator sensor technique was evaluated in a silicone model and applied to the human prostate in vitro.
-
2. Keeping Eye and Mind on the Road
Abstract : This thesis is devoted to understanding and counteracting the primary contributing factor in traffic crashes: inattention. Foremost, it demonstrates the fundamental importance of proactive gaze in the road centre area for action guidance in driving.
-
3. Peripheral Optics of the Human Eye: Applied Wavefront Analysis
Abstract : In this thesis, wavefront analysis is used to study the peripheral optics of the eye, with emphasis on its significance for the development of nearsightedness (myopia). The aim is to identify properties of peripheral image quality that the eye could use to regulate its growth.
-
4. Wavefront Aberrations and Peripheral Vision
Abstract : Failing eyesight causes a dramatic change in life. The aim of this project is to help people with large central visual field loss to better utilize their remaining vision. Central visual field loss means that the person must rely on peripheral vision, since direct vision is lost, often due to a dysfunctional macula.
-
5. Deep Learning Applications for Autonomous Driving
Abstract : This thesis investigates the usefulness of deep learning methods for solving two important tasks in the field of driving automation: (i) road detection and (ii) driving path generation. Road detection was approached using two strategies: the first considered a bird's-eye view of the driving scene obtained from LIDAR data, whereas the second carried out camera-LIDAR fusion in the camera perspective.