Light exposure and visual experience are key drivers of circadian timing, melatonin regulation and ocular development, but real-world wearable datasets are large, often messy, and hard to analyse reproducibly and robustly. LightLogR is an open-source R package developed specifically for these challenges: it streamlines the import, validation, analysis and visualisation of light-logging data in tidy, FAIR-oriented workflows. (Documentation)
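To give a flavour of such a workflow, here is a minimal sketch in R following the import–validate–visualise pattern from LightLogR's documentation. The device type (ActLumus), file name and time zone are illustrative placeholders, and function signatures may differ between package versions, so consult the current documentation before reusing this.

```r
library(LightLogR)

# Import raw logger data. LightLogR provides device-specific importers
# under `import$<device>`; ActLumus is shown here purely as an example.
dataset <- import$ActLumus("participant_01.txt", tz = "Europe/Berlin")

# Make implicit recording gaps explicit so downstream metrics are not
# silently biased by missing intervals.
dataset <- gap_handler(dataset)

# Visualise light exposure day by day.
gg_day(dataset)
```

This import-then-validate-then-plot sequence is exactly the kind of reusable, shareable pipeline the course series walks through.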
The Translational Sensory & Circadian Neuroscience Unit (MPS/TUM/TUMCREATE) is running an online course series, “Open and reproducible analysis of light exposure and visual experience data”, featuring live walkthroughs of reproducible analyses, including LightLogR tutorials, practical examples, reusable code, and dedicated Q&A, with a certificate upon completion. (Flyer)
Next date: Advanced course – 9 December 2025 (online), focusing on merging light data with other data streams such as sleep or activity, computing advanced metrics, handling multi–time-zone datasets, and working with spectral and distance measures.
An interactive course page, including a recording of the first beginner workshop, is available at the LightLogR webinar page. This is a great opportunity for SLRCH members to skill up on robust, shareable analysis pipelines for wearable light and visual experience data.

