Discord just made their suspicious-behavior tool open source: Osprey is a high-performance rules engine for real-time event processing and behavioral analysis. It can ingest any platform event, such as login attempts, content posts, account creations, or custom actions unique to a service, and run it through rules that detect and respond to emerging threats in real time, not weeks later. Safety and security teams can write expressive rules in a simple language, deploy new rules without any engineering dependencies, and get immediate, transparent decisions on whether something is safe, suspicious, or malicious.

Groupchat spam is not only a nuisance, it's also a security risk: phishing links get posted on a regular basis. Instead of reinventing the wheel, I'd like the Lab to look into the option of using Osprey for exactly that: detecting suspicious patterns in group chat.
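To make the idea concrete, here is a minimal sketch of the kind of rule-based classification described above: a chat-message event goes through a couple of rules and comes out labeled safe, suspicious, or malicious. Note this is plain Python for illustration only; it does not use Osprey's actual rule language or API, and the TLD blocklist and `repeat_count` field are invented for the example.

```python
import re
from urllib.parse import urlparse

# Hypothetical blocklist of TLDs often seen in phishing links (example only).
SUSPICIOUS_TLDS = {".zip", ".xyz", ".top"}
URL_RE = re.compile(r"https?://\S+")

def classify_message(event: dict) -> str:
    """Classify a chat-message event as 'safe', 'suspicious', or 'malicious'.

    `event` is assumed to carry the message text under 'content' and,
    optionally, how often the same link was recently posted under
    'repeat_count' (both field names are assumptions for this sketch).
    """
    urls = URL_RE.findall(event.get("content", ""))
    if not urls:
        return "safe"

    # Rule 1: a link whose host ends in a suspicious TLD -> malicious.
    hosts = [urlparse(url).netloc for url in urls]
    if any(host.endswith(tld) for host in hosts for tld in SUSPICIOUS_TLDS):
        return "malicious"

    # Rule 2: the same link blasted repeatedly in a short window -> suspicious.
    if event.get("repeat_count", 0) >= 3:
        return "suspicious"

    return "safe"
```

A real deployment would of course push such rules into the engine itself, so safety teams can change them without a code release, which is exactly the property that makes Osprey attractive here.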