Stream Up/Down Webhook and new endpoints for Clips and Top Games

The Stream Up/Down webhook immediately notifies you when a streamer goes online, offline, or starts a Vodcast. Your integrations no longer have to rely on polling, and your applications can react to streamers coming online in near real time!
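As a rough sketch, subscribing to the Stream Up/Down topic might look like the following. This assumes the webhooks hub endpoint `POST https://api.twitch.tv/helix/webhooks/hub` and a JSON subscription body; the client ID, callback URL, and user ID are placeholders you would substitute with your own values.

```python
import json
import urllib.request

# Placeholder values -- substitute your registered app and endpoint.
CLIENT_ID = "your-client-id"
CALLBACK = "https://example.com/twitch/callback"
USER_ID = "1234"  # hypothetical numeric user ID of the streamer

# Subscription request for the Stream Up/Down topic (helix/streams).
payload = {
    "hub.mode": "subscribe",
    "hub.topic": f"https://api.twitch.tv/helix/streams?user_id={USER_ID}",
    "hub.callback": CALLBACK,
    "hub.lease_seconds": 864000,  # example lease of 10 days
}

req = urllib.request.Request(
    "https://api.twitch.tv/helix/webhooks/hub",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Client-ID": CLIENT_ID, "Content-Type": "application/json"},
    method="POST",
)

# The request is only constructed here, not sent; calling
# urllib.request.urlopen(req) would actually subscribe.
print(req.get_method(), req.full_url)
```

Once subscribed, Twitch POSTs notifications to the callback URL whenever the stream state changes, so no polling loop is needed.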

Get Clip allows you to look up a Clip by its ID and see information such as who created it, which broadcaster was streaming, and what game was being played.
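A minimal sketch of that lookup, assuming the `GET https://api.twitch.tv/helix/clips` endpoint with an `id` query parameter; the client ID and Clip ID below are placeholders:

```python
import urllib.parse
import urllib.request

CLIENT_ID = "your-client-id"   # placeholder app credential
CLIP_ID = "ExampleClipSlug"    # hypothetical Clip ID

query = urllib.parse.urlencode({"id": CLIP_ID})
req = urllib.request.Request(
    f"https://api.twitch.tv/helix/clips?{query}",
    headers={"Client-ID": CLIENT_ID},
)

# Constructed only, not sent; urllib.request.urlopen(req) would return a
# JSON body whose "data" array holds fields such as the creator, the
# broadcaster, and the game being played.
print(req.full_url)
```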

Create Clip allows you to programmatically make Clips on streams with a simple API call.
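That simple API call might be sketched as below, assuming Create Clip is a `POST` to `https://api.twitch.tv/helix/clips` with a `broadcaster_id` parameter and a user OAuth token; the token and broadcaster ID are placeholders:

```python
import urllib.parse
import urllib.request

OAUTH_TOKEN = "your-oauth-token"  # placeholder user token
BROADCASTER_ID = "1234"           # hypothetical broadcaster to clip

query = urllib.parse.urlencode({"broadcaster_id": BROADCASTER_ID})
req = urllib.request.Request(
    f"https://api.twitch.tv/helix/clips?{query}",
    headers={"Authorization": f"Bearer {OAUTH_TOKEN}"},
    method="POST",
)

# Constructed only, not sent; sending it would create a Clip of the
# broadcaster's current stream and return the new Clip's ID.
print(req.get_method(), req.full_url)
```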

We’ve added a way for you to see the top games currently on Twitch! Just like in v5 of the API, you’re now able to quickly query the Get Top Games endpoint and see what’s popular.
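A quick query against that endpoint might look like this, assuming `GET https://api.twitch.tv/helix/games/top` with the usual helix `first` page-size parameter; the client ID is a placeholder:

```python
import urllib.parse
import urllib.request

CLIENT_ID = "your-client-id"  # placeholder app credential

# "first" caps the number of entries returned per page.
query = urllib.parse.urlencode({"first": 10})
req = urllib.request.Request(
    f"https://api.twitch.tv/helix/games/top?{query}",
    headers={"Client-ID": CLIENT_ID},
)

# Constructed only, not sent; the response's "data" array lists the
# most popular games currently on Twitch.
print(req.full_url)
```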

We’ve had a lot of questions about how to increase rate limits. Please fill out this form if you’d like to request a rate limit increase for Webhooks and/or the new Twitch API, and we can start that discussion with you.

Please read the announcement on the developer blog for more details. Our apologies for the delay in getting this on the forums. Our goal is to make announcements on the blog and the Announcements forum category simultaneously.

The docs for the Stream Up/Down webhook list two different endpoints, helix/streams and helix/users/streams; it should probably be specified which one is actually correct. Another doc change would be mentioning what happens when the stream goes down, since the provided example covers only the up case. Additionally, is there any chance the webhook's delay will be reduced any time soon? Right now it's quite slow, and polling the v5 API can be minutes faster.
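For the down case, one plausible sketch of a callback handler follows. It assumes the notification payload mirrors the helix/streams endpoint response, i.e. a non-empty "data" array while the stream is live and an empty one once it goes offline; that behavior is an assumption, not something the docs currently state.

```python
import json

def classify_stream_notification(body: bytes) -> str:
    """Classify a Stream Up/Down webhook payload.

    Assumes the notification mirrors the helix/streams response:
    a non-empty "data" array when live, an empty one when offline.
    """
    data = json.loads(body).get("data", [])
    return "up" if data else "down"

# Hypothetical payloads for illustration.
print(classify_stream_notification(b'{"data": [{"type": "live"}]}'))
print(classify_stream_notification(b'{"data": []}'))
```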

The delay was already discussed in the TwitchDev chat server. The webhook actually notices stream events within seconds, but it waits for the helix endpoint to pick them up because there is some sort of cache in between. They are working on reducing the delays, though!

There's unfortunately some inconsistent caching of stream data in some of our internal services.
Our Webhooks service actually knows about the stream going up/down before our API, but we wait for all caches to be cleared before we send the payload.
This way the webhook payload isn't ever inconsistent with our API endpoint (although there might be a delay).
We're resolving this at the beginning of next year.

That's what was said there.

I really hope they manage to fix this then. If it’s delayed 5-10 minutes, it kinda defeats the entire purpose of using it for real-time events.

Yeah, right now it's just a "we'll tell you, so you don't waste rate limit on polling."