Ph.D. in Robotics · Control Engineer

Jan Smisek

Translating advanced control theory into real-world systems—from force-feedback robotics on the International Space Station to laser communication terminals linking satellites in orbit.


Profile

With over a decade of experience in technical leadership, I have focused on developing high-performance systems in challenging environments. Currently, I lead the Space and Defense technology group at Transcelestial in Singapore, overseeing the development of laser-based communication systems for aerospace and defense applications.

Previously, my work at the European Space Agency (ESA) Telerobotics & Haptics Lab involved architecting control systems for telerobotics. This included contributions to the Haptics-2 experiment, which facilitated the first force-feedback handshake between the International Space Station and Earth.

Prior to that, I led the robotics development at Speedcargo, delivering AI-powered logistics solutions for aviation. I hold a Ph.D. in Robotics from Delft University of Technology, with a research focus on haptic shared control and human-machine interaction.

Experience

2020 — Present

Transcelestial

Head of Space & Defense

Responsible for strategy and engineering execution within the space and defense verticals, managing a portfolio of R&D contracts. The team develops laser communication terminals, including a space-to-ground system deployed into Low Earth Orbit in 2023. My role covers the full development lifecycle, from architectural design using ROS2 and Simulink to HIL/SIL testing and mission operations.

2018 — 2020

Speedcargo

Lead Robotics Engineer

Led the engineering team in the development of an AI-powered robotic aviation cargo packaging system. By implementing ML-driven perception and real-time grasp planning in ROS, the system achieved manipulation speeds comparable to those of human operators, handling a wide variety of package types in deployment at Changi Airport.

2012 — 2017

European Space Agency

Robotics & Control Engineer

Contributed to the development of real-time control software for spaceflight experiments (Interact, Haptics-1 & 2) deployed on the ISS. Work included the implementation of force/torque control algorithms for commercial arms (KUKA) and autonomous navigation software for rover platforms, facilitating teleoperation tasks with significant time delays.

Selected Publications

Haptic guidance on demand: A grip-force based scheduling of guidance forces

IEEE Transactions on Haptics, 2018

Smisek, J., Mugge, W., Smeets, J.B.J., van Paassen, M.M., Schiele, A.

Neuromuscular-System-Based Tuning of a Haptic Shared Control Interface

IEEE Transactions on Human-Machine Systems, 2017

Smisek, J., Sunil, E., van Paassen, M.M., Abbink, D., Mulder, M.

3D with Kinect: High Accuracy Depth Measurement

IEEE International Conference on Computer Vision, 2011

Smisek, J., Jancosek, M., Pajdla, T.


ESA INTERACT

Space Telerobotics · Real-time Control

The Interact experiment demonstrated end-to-end real-time robotic control from Space. It verified that sophisticated robots on Earth could be effectively teleoperated from the International Space Station (ISS) with high-fidelity haptic feedback, validating the necessary space communication protocols.

Technical Context

The mission involved controlling a rover situated 400 km below via the TDRSS geosynchronous satellite constellation. This setup introduced variable time delays of up to 0.8 seconds—a significant factor for closed-loop haptic control stability.
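
A back-of-envelope check of that delay figure can be made from the link geometry alone. The sketch below assumes nadir-aligned distances (ISS at 400 km, TDRS at geosynchronous altitude), so it gives only a lower bound on propagation delay; the remainder of the quoted 0.8 s budget comes from longer slant ranges, ground routing, and processing.

```python
C_KM_PER_S = 299_792.458   # speed of light in vacuum
ISS_ALT_KM = 400           # ISS orbital altitude
GEO_ALT_KM = 35_786        # TDRS geosynchronous altitude

# One signal leg: ISS -> TDRS -> ground station.
# Nadir-aligned distances; actual slant ranges are longer.
iss_to_tdrs = GEO_ALT_KM - ISS_ALT_KM
tdrs_to_ground = GEO_ALT_KM

one_way_s = (iss_to_tdrs + tdrs_to_ground) / C_KM_PER_S
round_trip_s = 2 * one_way_s
print(f"propagation-only round trip: {round_trip_s * 1000:.0f} ms")
```

Even this idealized geometry yields roughly half a second of round-trip propagation delay before any network overhead, which is why delay-tolerant control design dominated the mission.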

Key Contributions

  • Embedded Systems: Developed the embedded real-time C++ software core, integrating disparate hardware components via RTI DDS middleware.
  • Control Interface: Researched and implemented a specialized human-machine interface ensuring safe robotic arm operation over lossy, high-latency networks.
  • System Integration: Designed and integrated the power, safety, and communication subsystems using industrial-grade electronics for reliable operation.
  • Vision & Autonomy: Implemented model-based task-space control using visual markers, enabling the rover to align for precise connector mating tasks.

6G StarLab

Laser Communication R&D

Transcelestial's laser communication terminal has been space-qualified for Europe's 6G StarLab mission. This initiative focuses on developing next-generation wireless technologies for future 6G networks, utilizing advanced optical communication systems to achieve ultra-high bandwidth connectivity.

Mission & Technology

The 6G StarLab project, led by the Institute of Space Studies of Catalonia (IEEC) and Open Cosmos, aims to test 6G technologies in Low Earth Orbit (LEO). Transcelestial's laser terminal is a critical component, enabling high-speed data links between the satellite and ground stations.

Key technological advancements include robust beam pointing and tracking mechanisms essential for maintaining stable optical links in the dynamic LEO environment. This mission represents a significant step towards a future where satellite networks provide ubiquitous, high-speed connectivity.


Intersatellite Laser Communication

Space Collaboration

This project marks Singapore's first inter-satellite laser communications mission, a strategic collaboration between Transcelestial, OSTIn, and ST Engineering Satellite Systems. The mission's primary objective is to demonstrate high-speed, secure optical data links between satellites in orbit.

Strategic Impact

By establishing optical cross-links between satellites, this technology enables a space-based mesh network. This allows data to be routed directly between satellites, bypassing the need for constant ground station visibility and significantly reducing latency for global data transmission.

Transcelestial's role involves providing the sophisticated laser terminals capable of precise beam acquisition and tracking required to maintain a connection between two fast-moving objects in space.


ESA HAPTICS-1

ISS · Haptics · Human Physiology

On July 28th, 2014, the European Space Agency launched the Haptics-1 Kit to the International Space Station (ISS) aboard the Automated Transfer Vehicle ATV-5. Arriving two weeks later, it became the first haptic master device to enter the ISS.

The experiment, which commenced on December 30th, 2014, marked the first force-feedback and human perceptual-motor performance tests in the history of spaceflight. Three astronauts participated in the study until November 2015, providing crucial data on the effects of microgravity on psychomotor performance with haptic feedback. Experiments were conducted only after full adaptation to the space environment (three months in orbit).

Technical Construction

The Haptics-1 joystick is a high-performance mechatronic system designed for spaceflight. It features:

  • Active Joint Impedance Control: Enabled through a custom torque sensor at the joint output.
  • High-Power Actuation: Driven by a highly power-dense RoboDrive ILM brushless motor.
  • Real-Time Architecture: Runs internally at a 2 kHz sample rate in hard real time on an Intel Atom Z530 (1.6 GHz) embedded computer.
  • Communication: Utilizes a real-time EtherCAT bus for internal communication between the computer and joint motor controller.
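
The control structure implied by the list above (an outer impedance law closed through an inner torque loop on the custom joint sensor) can be sketched as follows. Gains, signatures, and the single-joint scalar form are illustrative assumptions, not the flight values.

```python
def impedance_torque(q, q_dot, q_ref, k=2.0, b=0.05):
    """Outer impedance law: torque of a virtual spring-damper
    anchored at q_ref (gains illustrative, units SI)."""
    return k * (q_ref - q) - b * q_dot

def torque_servo(tau_des, tau_meas, integ, kp=5.0, ki=50.0, dt=1 / 2000):
    """Inner torque loop closed on the joint torque sensor at the
    2 kHz sample rate. Returns motor command and updated
    integrator state."""
    err = tau_des - tau_meas
    integ += err * dt
    return kp * err + ki * integ, integ
```

In use, `impedance_torque` would run each cycle to produce `tau_des`, which `torque_servo` then tracks against the measured sensor torque; the inner loop is what makes the joint's rendered impedance largely independent of motor and gearbox friction.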

Key Findings

Detailed experimental results from the first stiffness Just Noticeable Difference (JND) study in space showed no major alterations in-flight compared to on-ground data, provided the manipulandum is secured against a sufficiently stiff reference structure.


ESA HAPTICS-2

ISS · Haptics · Model-Mediated Control

HAPTICS-2 was the first project to demonstrate advanced bilateral teleoperation between Space and Ground. The project aimed to explore the fundamental limits of state-of-the-art teleoperation systems over a geostationary satellite relay link.

Project Outcome

The experiment successfully connected a haptic joystick on the ISS with a counterpart at ESA's ESTEC laboratory. It demonstrated that valid force feedback cues could be transmitted and felt by an astronaut, even with round-trip delays of approximately 850 milliseconds.

Technical Implementation

To achieve stability while maintaining transparency, novel model-mediated control extensions were developed. These algorithms locally simulated the remote environment's stiffness and damping, updating the model upon the arrival of new data. Real-time linear/nonlinear estimators (KF, EKF, RLS) were also implemented to improve the fidelity of the force feedback signal in the presence of noisy telemetry.
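
One way the stiffness half of such a model-mediated scheme can be realized is a scalar recursive least-squares (RLS) estimator: delayed (penetration, force) samples update a local stiffness estimate, and the operator feels the local model immediately rather than the delayed remote force. This is a minimal sketch with illustrative parameters, not the flight implementation (which also used KF/EKF variants).

```python
class StiffnessRLS:
    """RLS estimate of contact stiffness k in f = k * x, updated
    whenever a delayed (x, f) telemetry sample arrives.
    Forgetting factor lam discounts stale data (values illustrative)."""

    def __init__(self, k0=100.0, p0=1e6, lam=0.99):
        self.k, self.p, self.lam = k0, p0, lam

    def update(self, x, f):
        if abs(x) < 1e-9:                     # no penetration: nothing to learn
            return self.k
        g = self.p * x / (self.lam + x * self.p * x)   # RLS gain
        self.k += g * (f - self.k * x)                 # innovation update
        self.p = (self.p - g * x * self.p) / self.lam
        return self.k

    def local_force(self, x):
        """Force rendered immediately to the operator from the local
        model, independent of the ~850 ms round trip."""
        return self.k * max(x, 0.0)
```

Because the operator closes the haptic loop against the local model, stability no longer hinges on the transmission delay; the delay only slows how quickly the model converges to the true environment.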


Interact Rover Platform

Mobile Robotics · Stereo Vision · Localization

The Interact Rover is a custom-integrated 4WD mobile platform equipped with two 7-DoF robot arms and a stereo camera head system. It served as the terrestrial unit for astronauts performing assembly tasks from orbit.

System Design

Software development focused on the platform's mobility and localization. This included interfacing with commercial 4WD controllers and developing a skid-steering kinematic model suitable for the loose terrain of the "Mars Yard" test facility.
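
The forward kinematics of such a skid-steering base reduce to two lines. The dimensions and the slip factor below are illustrative placeholders, not the rover's actual parameters; a slip factor below one derates the commanded yaw rate to account for track slippage on loose terrain.

```python
def skid_steer_twist(w_left, w_right, wheel_radius=0.1, track_width=0.6, slip=1.0):
    """Body twist of a 4WD skid-steer base from left/right wheel
    speeds (rad/s). Dimensions are illustrative, not the rover's."""
    v = wheel_radius * (w_left + w_right) / 2.0                      # m/s forward
    omega = slip * wheel_radius * (w_right - w_left) / track_width   # rad/s yaw
    return v, omega
```

Driving both sides at equal speed yields pure translation; opposing speeds yield turning in place, the maneuver skid-steer platforms rely on in tight quarters.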

Visual Guidance

A visual marker system was integrated to assist the operator. The rover identified task boards and calculated the relative pose of its arms in real-time, allowing for "supervisory control" modes where the astronaut could command high-level alignment actions before assuming manual control for precise insertion tasks.
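
The geometry of such a supervisory alignment step can be sketched with homogeneous transforms: chain the detected marker pose with a goal pose defined in the marker frame, then express the result in the tool frame to get an error the arm can servo down. Frame names and the function itself are hypothetical, for illustration only.

```python
import numpy as np

def align_error(T_cam_marker, T_cam_tool, T_marker_goal):
    """Task-space error between the arm's tool frame and a goal pose
    defined relative to a detected marker. Inputs are 4x4
    homogeneous transforms (frames hypothetical)."""
    T_cam_goal = T_cam_marker @ T_marker_goal
    T_tool_goal = np.linalg.inv(T_cam_tool) @ T_cam_goal
    pos_err = T_tool_goal[:3, 3]          # translation to goal, m
    # Rotation angle from the trace of the residual rotation matrix.
    ang_err = np.arccos(np.clip((np.trace(T_tool_goal[:3, :3]) - 1) / 2, -1.0, 1.0))
    return pos_err, ang_err
```

A supervisory "align" command would drive both errors toward zero before the astronaut takes over manual control for the final insertion.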


Systematic Framework for Haptic Shared Control

PhD Research · Human-Machine Interaction

Doctoral research at TU Delft focused on formalizing the design of Haptic Shared Control (HSC) systems. HSC operates between full automation and manual control, where both the human operator and the automation exert forces on the control interface.

Key Contribution: Grip-Force Scheduling

A primary contribution was "Haptic Guidance on Demand." This proposed a system where the authority of the automation (the stiffness of the guidance) adapts in real-time based on the operator's grip on the control stick.

This mechanism allows the operator to override automation during conflicts or unexpected events by gripping the stick more firmly, creating a natural "manual override" without the need for additional switches.
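
The scheduling idea can be sketched as a simple mapping from measured grip force to guidance stiffness. The linear fade and the threshold values below are illustrative assumptions, not the tuning from the thesis.

```python
def guidance_stiffness(grip_force, k_max=200.0, grip_lo=5.0, grip_hi=20.0):
    """Haptic-guidance-on-demand sketch: guidance stiffness (N/m)
    fades out linearly as grip force rises from grip_lo to grip_hi
    newtons (thresholds illustrative). A firm grip thus acts as a
    natural manual override."""
    if grip_force <= grip_lo:
        return k_max                  # relaxed grip: full guidance authority
    if grip_force >= grip_hi:
        return 0.0                    # firm grip: operator in full command
    return k_max * (grip_hi - grip_force) / (grip_hi - grip_lo)
```

Each control cycle, the guidance force applied to the stick would then be this stiffness times the deviation from the automation's reference trajectory, so authority shifts continuously rather than through a discrete mode switch.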

Impact

The framework offers a systematic approach to tuning guidance forces, serving as an alternative to heuristic trial-and-error methods. It aims to produce systems that are safer, more comfortable, and which maintain the operator's situational awareness.