Simon Stijnen

Signapse

The demo of the Signapse application in action at VIVES.

Presenting the tech stack used in the Signapse project.

Signapse application home page.

Project Overview

Signapse is an accessibility solution developed for VIVES Project Experience that bridges the communication gap between deaf or hard-of-hearing and hearing individuals. The application uses computer vision and machine learning to recognize sign language in real time through the phone's camera and convert it to text and speech.
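
A minimal sketch of what the camera-side step could look like, assuming MediaPipe's Holistic solution; the function names, the fixed 225-value vector layout, and the confidence threshold are illustrative choices, not code from the Signapse repository:

```python
import cv2
import mediapipe as mp
import numpy as np

mp_holistic = mp.solutions.holistic

def extract_landmarks(frame_bgr, holistic):
    """Flatten hand and pose landmarks from one frame into a fixed-length vector."""
    results = holistic.process(cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB))

    def flat(landmark_list, n_points):
        if landmark_list is None:
            # Zero-fill so the vector length stays constant when detection fails.
            return np.zeros(n_points * 3)
        return np.array([[p.x, p.y, p.z] for p in landmark_list.landmark]).flatten()

    return np.concatenate([
        flat(results.left_hand_landmarks, 21),   # 21 hand landmarks x (x, y, z)
        flat(results.right_hand_landmarks, 21),
        flat(results.pose_landmarks, 33),        # 33 pose landmarks x (x, y, z)
    ])                                           # total: 225 values per frame

with mp_holistic.Holistic(min_detection_confidence=0.5) as holistic:
    cap = cv2.VideoCapture(0)                    # webcam stands in for the phone camera
    ok, frame = cap.read()
    if ok:
        features = extract_landmarks(frame, holistic)   # shape: (225,)
    cap.release()
```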

The project features a multi-model AI pipeline combining PyTorch LSTM networks for sequential analysis with MediaPipe for hand and pose landmark extraction. It supports both ASL (American Sign Language) and VGT (Flemish Sign Language), recognizing individual letters and complete words with high accuracy.
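
The sequential half of that pipeline could look like the following PyTorch sketch, where per-frame landmark vectors are stacked into a clip and classified by an LSTM. The layer sizes, two-layer depth, and 26-class output are placeholder hyperparameters, not the project's actual configuration:

```python
import torch
import torch.nn as nn

class SignLSTM(nn.Module):
    """Classify a sequence of per-frame landmark vectors as one sign."""

    def __init__(self, input_size=225, hidden_size=128, num_classes=26):
        super().__init__()
        self.lstm = nn.LSTM(input_size, hidden_size, num_layers=2, batch_first=True)
        self.head = nn.Linear(hidden_size, num_classes)

    def forward(self, x):              # x: (batch, frames, input_size)
        out, _ = self.lstm(x)          # out: (batch, frames, hidden_size)
        return self.head(out[:, -1])   # predict from the final time step

model = SignLSTM()
clip = torch.randn(1, 30, 225)         # e.g. a 30-frame landmark sequence
logits = model(clip)                   # (1, 26) class scores
```

Classifying from the last hidden state keeps inference to a single forward pass per clip, which suits a real-time recognition loop.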

Built with a modular architecture, the solution consists of a React Native mobile app with TypeScript for the frontend, a FastAPI-based Python backend for AI processing, and a robust DevOps setup using Docker containers and Kubernetes for production deployment. The custom smart_gestures package, published on PyPI, enables feature extraction and gesture recognition across different components of the system.
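
To make the frontend/backend split concrete, here is a hedged sketch of how a FastAPI service could expose the model to the mobile app; the `/predict` route, payload shape, and response format are assumptions for illustration and do not describe the published smart_gestures API:

```python
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class LandmarkSequence(BaseModel):
    frames: list[list[float]]          # one flattened landmark vector per frame

@app.post("/predict")
def predict(seq: LandmarkSequence):
    # In the real system this is where the PyTorch pipeline would run;
    # here a sequence-length echo stands in for actual inference.
    return {"frames_received": len(seq.frames), "prediction": None}
```

Packaging a service like this in a Docker container and scheduling it on Kubernetes keeps the AI workload isolated from the React Native client, matching the modular split described above.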

Technologies Used

  • AI
  • Python
  • Kubernetes
  • PyTorch
  • React Native
  • CI/CD
  • FastAPI
  • Docker
  • TypeScript

Related Projects

CERM MCP PoC
A proof-of-concept email automation agent using a Microsoft 365 MCP server and LangChain.
Pop-a-loon
A full-stack Chrome extension with 200+ users, featuring interactive balloons and real-time stats.
Final project: Bluetooth Device Localization
Locating devices in a room via Bluetooth signal strength.