Tagged: radar

FOSDEM 2024 Videos now Available: Synthetic Aperture WiFi RADAR, GPU DSP Acceleration and more

FOSDEM (Free and Open source Software Developers' European Meeting) is a yearly conference that took place in Brussels, Belgium on 3 - 4 February 2024. This year's conference featured a devroom on Software Defined Radio and Amateur Radio.

Recently the videos of most of the talks have been uploaded to the FOSDEM website. Some interesting talks include:

Covert Ground Based Synthetic Aperture RADAR using a WiFi emitter and SDR receiver

Link to Talk Page

Using a WiFi emitter as radiofrequency source illuminating a scene under investigation for slow movement (e.g. landslides), a Ground-Based Synthetic Aperture RADAR (GB-SAR) is assembled using commercial, off the shelf hardware. The dual-channel coherent Software Defined Radio (SDR) receiver records the non-cooperative emitter signal as well as the signal received by a surveillance antenna facing the scene. Spatial diversity for azimuth mapping using direction of arrival measurement is achieved by moving the transmitter and receiver setup on a rail along a meter-long path -- the longer the better the azimuth resolution -- with quarter wavelength steps. The fully embedded application runs on a Raspberry Pi 4 single board computer executing GNU Radio on a Buildroot-generated GNU/Linux operating system. All development files are available at https://github.com/jmfriedt/SDR-GB-SAR/
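For readers curious about what the processing looks like, the heart of any passive-illuminator radar like this is cross-correlating the reference channel (tapped straight from the WiFi emitter) against the surveillance channel (the antenna facing the scene) to find delayed echoes. The NumPy sketch below illustrates only that range-compression step on synthetic data; the sample rate, variable names and simple monostatic range conversion are our own assumptions, and the real SDR-GB-SAR code additionally handles direct-path suppression and the azimuth synthesis described in the abstract.

```python
import numpy as np

def range_profile(reference, surveillance, fs, max_range_m=200.0):
    """Cross-correlate the surveillance channel against the reference
    channel to obtain a delay (range) profile, as in a passive bistatic
    radar. Both inputs are complex baseband arrays sampled at fs (Hz)."""
    c = 3e8
    max_lag = int(np.ceil(2 * max_range_m / c * fs))   # delays of interest
    profile = np.empty(max_lag, dtype=complex)
    for lag in range(max_lag):
        # The surveillance channel is (roughly) a delayed, attenuated copy
        # of the reference plus clutter and noise.
        profile[lag] = np.vdot(reference[:len(reference) - lag],
                               surveillance[lag:])
    ranges_m = np.arange(max_lag) * c / fs / 2   # crude two-way conversion
    return ranges_m, np.abs(profile)

# Synthetic example: a single reflector delayed by 20 samples.
fs = 20e6   # 20 MHz of captured WiFi bandwidth (illustrative)
gen = np.random.default_rng(0)
ref = gen.standard_normal(100_000) + 1j * gen.standard_normal(100_000)
sur = 0.1 * np.roll(ref, 20) + 0.01 * (gen.standard_normal(100_000)
                                       + 1j * gen.standard_normal(100_000))
ranges_m, mag = range_profile(ref, sur, fs)
print("strongest return at ~%.0f m" % ranges_m[np.argmax(mag)])
```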

Synthetic Aperture RADAR with WiFi and USRP SDR

Using GPU for real-time SDR Signal processing

Link to Talk Page

GPU processors have become essential for image or AI processing. Can they bring anything to real-time signal processing for SDR applications? The answer is yes, of course, but not all classic algorithms (FIR, DDC, etc.) can be used "as is", sometimes a different approach must be taken. In this presentation, I will share the solutions that I implemented to achieve multi-channel DDC on NVIDIA Jetson GPU and will make a comparison with "classic CPU" approaches.
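For context, a digital down-converter (DDC) mixes the channel of interest down to baseband, low-pass filters it, and decimates. Below is a minimal "classic CPU" sketch of a single-channel DDC using NumPy/SciPy; it is our own illustration and not taken from the talk. On a Jetson-class GPU the same steps would typically be fused and batched across many channels (for example with CuPy or custom CUDA kernels), which is exactly the kind of restructuring the talk is about.

```python
import numpy as np
from scipy import signal

def ddc(iq, fs, f_center, decim, num_taps=129):
    """Basic single-channel digital down-converter: mix the channel at
    f_center (Hz) to baseband, low-pass filter, and decimate by 'decim'."""
    n = np.arange(len(iq))
    mixed = iq * np.exp(-2j * np.pi * f_center / fs * n)   # NCO mix
    cutoff = fs / decim / 2 * 0.8                          # leave some margin
    taps = signal.firwin(num_taps, cutoff, fs=fs)          # FIR low-pass
    filtered = signal.lfilter(taps, 1.0, mixed)
    return filtered[::decim]                               # decimate

# Example: pull a 200 kHz channel at +1.2 MHz out of a 10 MHz capture.
fs = 10e6
t = np.arange(int(1e5)) / fs
iq = np.exp(2j * np.pi * 1.2e6 * t)          # fake carrier at +1.2 MHz
baseband = ddc(iq, fs, 1.2e6, decim=50)
print(baseband.shape)                        # 50x fewer samples
```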

Using GPU's for Real Time Signal Processing

Maia SDR: an open-source FPGA-based project for AD936x+Zynq radios

Link to Talk Page

Maia SDR is an open-source project with the main goal of promoting FPGA development for SDR and increasing the collaboration between the open-source SDR and FPGA communities. Currently it provides a firmware image for the ADALM Pluto and other radios based on the AD936x and Zynq. This firmware can display a real-time waterfall at up to 61.44 Msps in a WebSDR-like interface using WebGL2 rendering, and record IQ data in SigMF format in the SDR DDR. The FPGA design is implemented in Amaranth, a Python-based HDL, and the software stack is implemented in Rust, targeting the embedded ARM CPU and WebAssembly.

The first firmware version was released in February 2023, and the project was presented in June in the Software Defined Radio Academy. In this talk we cover the progress since the summer, including the addition of support for devices such as the Pluto+ and AntSDR. We focus on the technical details of the project and the possibilities for re-using some of the components in other projects.

Maia SDR

DAPNET: Bringing pagers back to the 21st Century

Link to Talk Page

When talking about pagers, most of us will think about an object of the past, often seen in TV shows from the 90s, used by medical staff and businessmen. However, they're an interesting way to get simple data broadcast over amateur radio frequencies, with receivers that can be built for less than 20€. We'll explore this and understand how an extensive network can be deployed with simple equipment and using open source hardware and software.

DAPNET Talk

Building a Drone Tracking Radar with the ADALM-PHASER and PlutoSDR

The ADALM-PHASER is a kit designed to provide experience with phased array beamforming and radar concepts. The kit consists of a PlutoSDR, mixers, an LO synthesizer, an ADAR1000 beamformer chip, LNAs and an array of patch antennas. It operates between 10 and 11 GHz, supports FMCW chirps with 500 MHz of bandwidth, and has 8 receive channels and 2 transmit channels. It is an open source kit that costs US$2800, and it is produced by and available from Analog Devices. Currently the kit appears to not be in stock, but they note that they are working on getting more stock in soon.
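To put the 500 MHz chirp bandwidth in perspective, the range resolution of an FMCW radar is c / (2 × B), so the kit can resolve targets roughly 30 cm apart. A quick back-of-the-envelope check (our own calculation, not taken from the kit documentation):

```python
# Range resolution of an FMCW radar from chirp bandwidth: dR = c / (2 * B)
c = 3e8          # speed of light, m/s
B = 500e6        # ADALM-PHASER chirp bandwidth, Hz
print(c / (2 * B))   # -> 0.3 m, i.e. about 30 cm range resolution
```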

The ADALM-PHASER is a phased array kit for implementing radar and other phased array experiments.

Over on YouTube, Jon Kraft, who appears to be affiliated with Analog Devices, is working on a series of videos that will ultimately result in a drone tracking radar built with the ADALM-PHASER. So far two videos have been released.

The first is an overview of radar concepts, giving an explanation of pulsed vs CW radar, and the various hardware options we have to implement low cost versions of these methods.

The second video covers more radar concepts like range resolution and shows us how to build a CW radar with the ADALM-PHASER system.
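To make the pulsed radar concepts from the videos concrete: a pulsed radar's maximum unambiguous range is set by the pulse repetition frequency (c / (2 × PRF)) and its range resolution by the pulse width (c × τ / 2). The numbers below are purely illustrative and not taken from the videos:

```python
c = 3e8           # speed of light, m/s
prf = 10e3        # pulse repetition frequency, Hz (illustrative)
tau = 1e-6        # pulse width, s (illustrative)
print("max unambiguous range: %.1f km" % (c / (2 * prf) / 1e3))   # 15.0 km
print("range resolution:      %.1f m"  % (c * tau / 2))           # 150.0 m
```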

The three remaining videos are yet to be released, so keep an eye on his channel for updates.

Build Your Own Drone Tracking Radar: Part 1

Build Your Own Drone Tracking Radar: Part 2 CW Radar

A LimeSDR Mini Based Doppler Radar

Thanks to Luigi (aka @luigifcruz and PU2SPY) for writing in and sharing his LimeSDR-based Doppler radar blog post. The LimeSDR Mini is a low cost SDR with one TX and one RX port. Luigi's Doppler radar uses the TX port to transmit the radar signal and the RX port to receive the reflection. The idea is that if the object being measured is moving, the received reflection will be shifted in frequency due to the Doppler effect.

In terms of hardware, Luigi's radar uses the LimeSDR Mini as the TX/RX radio, a Raspberry Pi 3 as the computing hardware, an SPF5189Z based LNA on the RX side, and two cantenna antennas. It transmits a continuous wave signal at 2.4 GHz.

Luigi's LimeSDR Based Doppler Radar

On the software side it uses a GNU Radio program to transmit, receive and process the returned signal. Luigi's post goes over the DSP concepts in greater detail, but the basic idea is to measure the phase shift between the transmitted and reflected signals via a Multiply Conjugate block, and then decimate the output to increase the frequency resolution. The result is then output on a frequency domain waterfall graph. The GNU Radio flowgraph is all open source and available on Luigi's GitHub.
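The same multiply-conjugate-then-decimate idea can be sketched outside of GNU Radio in a few lines of NumPy/SciPy. This is a simplified illustration of the processing described above rather than Luigi's actual flowgraph; the decimation factor, FFT size and test signal are our own choices.

```python
import numpy as np
from scipy import signal

def doppler_spectrum(tx, rx, fs, decim=100, nfft=4096):
    """Mix the received signal with the conjugate of the transmitted
    reference (the Multiply Conjugate step), decimate to narrow the
    bandwidth around DC, then take a windowed FFT. Moving targets show
    up as tones offset from 0 Hz by their Doppler shift."""
    beat = rx * np.conj(tx)
    beat = signal.decimate(beat, decim, ftype='fir')
    fs_d = fs / decim
    windowed = beat[:nfft] * np.hanning(nfft)
    spectrum = np.fft.fftshift(np.fft.fft(windowed))
    freqs = np.fft.fftshift(np.fft.fftfreq(nfft, 1 / fs_d))
    return freqs, 20 * np.log10(np.abs(spectrum) + 1e-12)

# Synthetic test: a target producing a 160 Hz Doppler tone, roughly what a
# 10 m/s target gives at a 2.4 GHz carrier.
fs = 2e6
t = np.arange(int(fs)) / fs                       # 1 second of samples
tx = np.ones(len(t), dtype=complex)               # CW reference at baseband
rx = (0.05 * np.exp(2j * np.pi * 160 * t)
      + 0.01 * (np.random.randn(len(t)) + 1j * np.random.randn(len(t))))
freqs, spec_db = doppler_spectrum(tx, rx, fs)
print("peak at %.1f Hz" % freqs[np.argmax(spec_db)])
```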

In order to test the system Luigi first set up a test to measure an electric fan's blade speed. The result was a clearly visible line in the spectrogram which moved depending on the fan's speed setting.

Software Defined Radar - Continuous Wave Doppler Radar w/ LimeSDR

In his second test Luigi measured the speed of vehicles by placing the radar on the sidewalk, pointed at passing cars. The result was a clear indication of each vehicle pass, shown by the longer vertical lines on the graph below. The smaller lines are attributed to pedestrians walking by.

LimeSDR Vehicle Doppler Radar Results: Each long line indicates a vehicle, and shorter lines indicate pedestrians.

In a third test, Luigi measured vehicle speeds in tougher conditions, with the radar placed 50 meters away from the highway, at a 45 degree angle, and with weeds in the way. The radar still generated obvious lines indicating vehicle passes. Finally, in his fourth test, Luigi checked the speed accuracy of his radar by measuring a car driving at a known speed. The results showed excellent accuracy.
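For reference, turning a measured Doppler line into a speed is a one-line calculation for a monostatic CW radar: v = f_Doppler × c / (2 × f_carrier). At Luigi's 2.4 GHz carrier, every 16 Hz of Doppler shift corresponds to about 1 m/s. The 320 Hz figure below is made up for illustration and not taken from Luigi's results:

```python
def doppler_to_speed(f_doppler_hz, f_carrier_hz=2.4e9, c=3e8):
    """Monostatic CW radar: target radial speed from the Doppler shift."""
    return f_doppler_hz * c / (2 * f_carrier_hz)

# e.g. a 320 Hz Doppler line at 2.4 GHz (illustrative number):
v = doppler_to_speed(320)
print("%.1f m/s = %.0f km/h" % (v, v * 3.6))   # 20.0 m/s = 72 km/h
```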

Software Defined Radar - Continuous Wave Doppler Radar w/ LimeSDR

Tracking People Through Walls with WiFi Passive Radar

For a while now researchers at MIT and several other universities have been investigating methods that use frequencies in the WiFi bands to see through walls with a form of low power radar. The basic concept is to track and process the reflections of these signals off people's bodies.

Recently researchers at MIT have taken this idea a step further, combining the radar results with machine learning in a project they call RF-Pose. The result is the ability to recreate and track full human pose information through walls. The abstract from their paper reads:

This paper demonstrates accurate human pose estimation through walls and occlusions. We leverage the fact that wireless signals in the WiFi frequencies traverse walls and reflect off the human body. We introduce a deep neural network approach that parses such radio signals to estimate 2D poses. Since humans cannot annotate radio signals, we use a state-of-the-art vision model to provide cross-modal supervision.

Specifically, during training the system uses synchronized wireless and visual inputs, extracts pose information from the visual stream, and uses it to guide the training process. Once trained, the network uses only the wireless signal for pose estimation. We show that, when tested on visible scenes, the radio-based system is almost as accurate as the vision-based system used to train it. Yet, unlike vision-based pose estimation, the radio-based system can estimate 2D poses through walls despite never trained on such scenarios.

The hope is that this technology could one day be used as a replacement for camera-based computer vision. It would be a non-intrusive method for applications like gaming, monitoring the elderly for falls, motion capture during film making without the need for suits, and of course gathering data on people's movements.

It is not mentioned in the paper, but it is likely that they are using some sort of SDR like a USRP for receiving the signals. It's possible that a lower resolution system could be set up cheaply with a HackRF and some passive radar software.

RF-Pose: estimating human pose behind walls using RF signals in the WiFi frequencies.
Multiple people tracked with RF-Pose
AI Senses People Through Walls

Aerial Landmine Detection using USRP SDR Based Ground Penetrating Radar

Over the last few years researchers at Universidad Javeriana Bogotá, a university in Colombia, have been investigating the use of SDRs for aerial landmine detection. The research uses a USRP B210 software defined radio mounted on a quadcopter, together with two Vivaldi antennas (one for TX and one for RX). The system is used as a ground penetrating radar (GPR). GPR is a method that uses RF pulses in the range of 10 MHz to 2.6 GHz to create images of the subsurface. When a transmitted RF pulse hits a metallic object like a landmine, energy is reflected back, resulting in a detection.
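What a GPR actually measures is the two-way travel time of each pulse; turning that into a depth requires the wave speed in the soil, which is slowed by the relative permittivity (v = c / √εr). The short sketch below shows that conversion with an illustrative permittivity value and is not taken from the researchers' processing chain:

```python
import math

def gpr_depth(two_way_time_s, epsilon_r):
    """Depth of a reflector from the two-way travel time of a GPR pulse,
    given the relative permittivity of the soil (v = c / sqrt(eps_r))."""
    c = 3e8
    v = c / math.sqrt(epsilon_r)
    return v * two_way_time_s / 2

# Example: a return arriving 4 ns after the surface reflection, in dry
# sandy soil with eps_r ~ 4 (illustrative values).
print("%.2f m" % gpr_depth(4e-9, 4.0))   # -> 0.30 m
```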

Recently they uploaded a demonstration video to their YouTube channel, which we show below, and several photos of the work can be found on their Field Robotics website. We have also found their paper, which is available here as part of a book chapter. The abstract reads:

This chapter presents an approach for explosive-landmine detection on board an autonomous aerial drone. The chapter describes the design, implementation and integration of a ground penetrating radar (GPR) using a software defined radio (SDR) platform into the aerial drone. The chapter's goal is first to tackle in detail the development of a custom-designed lightweight GPR by examining the interplay between hardware and software radio on an SDR platform. The SDR-based GPR system results in a much lighter sensing device compared with the conventional GPR systems found in the literature, with the capability of being re-configured in real time for different landmines and terrains, and of detecting landmines under terrains with different dielectric characteristics.

Secondly, the chapter introduces the integration of the SDR-based GPR into an autonomous drone, describing the mechanical integration, the communication system, and the graphical user interface (GUI), together with landmine detection and geo-mapping. The chapter covers the hardware and software implementation of the on-board GPR system, first giving a comprehensive background of software-defined radar technology and then presenting the main features of the Tx and Rx modules. Additional details are presented relating to the mechanical and functional integration of the GPR into the UAV system.

Drone with USRP Ground Penetrating Radar Setup
Aerial landmine detection using SDR-based Ground Penetrating Radar and computing vision

PiAware Radar – A Traditional Radar-Like Display for ADS-B, and Setting up an ADS-B Cockpit Flight Display

PiAware Radar is a Python script that connects to your PiAware server and uses the received ADS-B data to draw a familiar radar-like display (a green circle with a rotating radius, and aircraft shown as blips). PiAware is the software used to take ADS-B data from an RTL-SDR dongle running on a Raspberry Pi and feed it to flightaware.com. A radar-like display is probably not very useful in practice, but it could make for an interesting display that might impress friends. Over on his blog, IT9YBG has uploaded a tutorial that shows how to set up PiAware Radar on a Raspberry Pi.
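For anyone curious how a script like this works in principle: PiAware/dump1090 exposes the current aircraft list as JSON (commonly served as aircraft.json by the dump1090 web server), and a radar-scope display only needs each aircraft's range and bearing from the receiver. The sketch below is our own minimal illustration of that idea rather than the PiAware Radar code itself; the URL and receiver coordinates are placeholders you would need to adapt to your own install.

```python
import json, math, urllib.request

# Placeholder values: adjust for your own PiAware install and location.
AIRCRAFT_JSON = "http://piaware.local:8080/data/aircraft.json"
RX_LAT, RX_LON = 51.50, -0.12

def range_bearing(lat1, lon1, lat2, lon2):
    """Approximate range (km) and bearing (deg) from receiver to aircraft
    using an equirectangular approximation, fine at ADS-B ranges."""
    dlat = math.radians(lat2 - lat1)
    dlon = math.radians(lon2 - lon1) * math.cos(math.radians((lat1 + lat2) / 2))
    rng = 6371.0 * math.hypot(dlat, dlon)
    brg = (math.degrees(math.atan2(dlon, dlat)) + 360) % 360
    return rng, brg

with urllib.request.urlopen(AIRCRAFT_JSON) as f:
    data = json.load(f)

for ac in data.get("aircraft", []):
    if "lat" in ac and "lon" in ac:          # only aircraft with a position
        rng, brg = range_bearing(RX_LAT, RX_LON, ac["lat"], ac["lon"])
        print("%-8s %6.1f km  %5.1f deg" % (ac.get("flight", "?").strip(), rng, brg))
```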

Also on his blog, IT9YBG has uploaded another tutorial that shows how to set up 1090XHSI, a program that displays a 737 aircraft cockpit simulation using live ADS-B data. The ADS-B data updates the instrument displays in real time, giving you a view of what the pilots might be seeing on the dashboard of their aircraft. We posted about this software in the past, but IT9YBG's tutorial makes it much easier to set up.

PiAware Radar
1090 XHSI 737 Cockpit Simulation from ADS-B Data

Analyzing Radar Pulses with Baudline and an RTL-SDR

Over on YouTube, user Albert Schäferle has uploaded a short video showing his reception of some radar pulses and their corresponding echoes. He uses rtl_fm and pipes the output into Baudline, which is used to display the radar waveform. In the video description he writes:

Receiving direct and (supposedly) reflected pulses from an L-band radar in Učka, HR (Lockheed Martin AN/FPS-117). The receiving station was 83 km away, with clear LOS.
Center frequency is 1258 MHz (one out of four that this frequency-agile radar head is using).
The receiver is a RTL-SDR dongle (R820T tuner IC) with a 2-dipole collinear array (tuned for 403 MHz) and approx 7 m of Belden 1694A RG-6 coax.

rtl_fm output was piped to baudline, which is the software shown in the video. The IQ sampling rate is 2 MHz; the transform is a complex STFT (size=2048 samples, Blackman window).
This is a 0.008x speed playback of 15 ms of recording.
The (again, supposedly) reflected pulses are obviously more time-local with a shorter transform window size, e.g. 512 samples http://i.imgur.com/sAHWhwD.png

The effect of pulse compression is quite evident http://www.radartutorial.eu/08.transm…
The direct-reflected delay is approx 278 µs (~42 km from receiver, in a simple 2D, along beam, normal incidence model). I should add that this “reflection delay” effect does not usually show up.
There’s another fainter echo closer to the pulse, but I suspect that it could be a time-sidelobe of the main pulse: a side effect of pulse compression. Anyway, I must state that I have no formal knowledge on radar topics. So you’d better take all this with a grain of salt 😉

Link to recording: https://db.tt/Lxe67Ig3 (save destination as…)

Video recorded with VLC, audio piped to stdout and saved, then synced in Blender.

Radar WGS84 coordinates: 45.286757,14.202732 http://www.panoramio.com/photo/26952908

Analyzing radar pulses with Baudline and RTL-SDR.
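As a closing sanity check on the numbers in the description above: the ~278 µs delay between the direct and reflected pulses corresponds to roughly 83 km of extra path length, and in the simple along-beam, normal-incidence model that extra path is an out-and-back leg between the receiver and the reflector, placing the reflector about half that distance (~42 km) from the receiver:

```python
c = 3e8            # speed of light, m/s
delay = 278e-6     # direct-to-reflected delay quoted in the description, s
extra_path = c * delay
print("extra path length:   %.1f km" % (extra_path / 1e3))              # ~83.4 km
print("reflector approx.    %.1f km from receiver" % (extra_path / 2 / 1e3))  # ~41.7 km
```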