The LHCb experiment is designed to search for new physical phenomena in proton-proton collisions at the LHC. To do so, it processes proton-proton collisions at a rate of 30 MHz, producing a data volume of about 5 TB/s, which must be reduced by orders of magnitude to allow for long-term storage. This reduction is performed by analyzing the data in real time with a two-stage software trigger that selects a small subset of the data for storage. The first stage, called Allen, processes the full 5 TB/s produced by the LHCb detector on GPUs. This GPU-based trigger offers improved speed and flexibility, allowing the LHCb experiment to extend its physics reach using previously infeasible reconstruction and analysis techniques.
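The scale of the reduction can be illustrated with a back-of-the-envelope calculation using the figures quoted above (30 MHz event rate, ~5 TB/s raw data rate). The target storage bandwidth below is a hypothetical placeholder, not a number from the text; it is there only to show what "orders of magnitude" means in practice.

```python
# Back-of-the-envelope check of the data rates quoted in the text.
event_rate_hz = 30e6      # proton-proton collision (event) rate: 30 MHz
data_rate_bytes = 5e12    # raw detector output: ~5 TB/s

# Average raw event size implied by the two numbers above.
event_size_bytes = data_rate_bytes / event_rate_hz
print(f"average event size ~ {event_size_bytes / 1e3:.0f} kB")  # ~ 167 kB

# Hypothetical long-term storage bandwidth (illustrative assumption only):
storage_rate_bytes = 10e9  # 10 GB/s
reduction_factor = data_rate_bytes / storage_rate_bytes
print(f"required reduction ~ {reduction_factor:.0f}x")  # ~ 500x
```

Even with a generous storage bandwidth, the trigger must discard all but a small fraction of the recorded collisions, which is why the selection is performed in real time rather than offline.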