API requests using overlay extension

Hey everyone,

I am making an extension right now and it works by sending HTTP requests to API Gateway → Kinesis data stream.

The thing is that because it is client-side JavaScript, each viewer is unique, and I don’t think it is possible to batch every viewer’s requests into one big payload, as Lambda does not keep data between invocations.
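
Roughly, each viewer’s frontend is doing something like this (the endpoint URL, channel id and payload shape below are placeholders, not my real values):

```javascript
// Sketch of what each viewer's frontend does today. The endpoint URL,
// channel id and choice value are placeholders, not my real ones.
const VOTE_ENDPOINT = 'https://example.execute-api.eu-west-1.amazonaws.com/prod/vote';
const channelId = '12345';   // placeholder
let currentChoice = 'A';     // whatever the viewer last clicked

setInterval(() => {
  // One request per viewer every 5 seconds, so the request count scales
  // linearly with concurrent viewers.
  fetch(VOTE_ENDPOINT, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ channelId, choice: currentChoice }),
  }).catch(() => { /* ignore transient network errors */ });
}, 5000);
```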

So I am concerned that as the viewer count goes up, the number of requests between the Twitch extension and API Gateway/Kinesis will also go up into the billions per day and become way too expensive.

Do you have an idea how to optimize this so I won’t have to pay $135,000/month?
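
For reference, this is the back-of-envelope behind that figure; the viewer count and the per-million request price are just illustrative assumptions, not real quotes:

```javascript
// Back-of-envelope behind the $135,000/month figure. The viewer count and
// the per-million request price are illustrative assumptions only.
const concurrentViewers = 75_000;         // assumed peak audience voting
const requestsPerViewerPerSecond = 1 / 5; // one vote every 5 seconds
const secondsPerMonth = 30 * 24 * 3600;   // 2,592,000

const requestsPerMonth =
  concurrentViewers * requestsPerViewerPerSecond * secondsPerMonth;

const pricePerMillionRequests = 3.5;      // USD, placeholder rate
const monthlyCost = (requestsPerMonth / 1e6) * pricePerMillionRequests;

console.log(requestsPerMonth); // 38,880,000,000/month, i.e. ~1.3 billion/day
console.log(monthlyCost);      // 136,080 → roughly that $135,000/month range
```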

Additional info:

I cannot rate limit the requests, as they need to be sent in real time, at least once every 5 seconds.

Cheers!

by not using Lambda?

If you expect a thing to be too expensive to use then the only solution is a different product/solution.

Hmm, so using something cheaper is what you mean?

Or do you mean there is a way other than that?

If you cannot afford to use Lambda, then you need something you can afford that can do what you need it to do.

Obviously. I’m asking if there is another way though.

I don’t get what you are asking

If you cannot afford API Gateway → Kinesis data stream, then you can’t use API Gateway → Kinesis data stream, so you need to select different technologies or build a different solution you can afford that does what you need.

Personally, for all my extensions I run bare metal, not cloud.
And if I need to scale beyond one server, then I would get another server.

I get that, but I’m saying that ultimately the extension will always output data every 5 seconds for each individual viewer, so I don’t know technically how to deal with it in a cheaper way than using API Gateway → Kinesis.

That’s why I am asking here if someone has any idea

Why are you sending a unique message back to each viewer every 5 seconds?

Kinda sounds like a use case for a server running a WebSocket, since you exceed what Twitch Extension PubSub can do; PubSub won’t do unique messages per viewer easily.

Every 5 seconds, every unique viewer votes on the screen overlay extension. The data sent is the vote, which currently gets fed into Kinesis for computation to output the most voted choice.

However, when more viewers vote, since there is a vote every 5 seconds when active, billions of inputs might be sent per month, hence my need for a new solution.

  • When a viewer clicks to vote
  • That makes an HTTP request to my server
  • I score the vote
  • I send back the results/current score to the channel every 1 second or slower via Twitch Extension PubSub
  • My bare metal server does the counting; I don’t need Kinesis/cloud for that (rough sketch below)
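
Something like this is all the backend needs to be. It’s only a rough sketch with in-memory tallies, and the PubSub push is stubbed out: the real call is Twitch’s Helix “Send Extension PubSub Message” endpoint, authorised with a JWT signed using your extension secret (check the docs for the exact claims), and a real EBS would also verify the extension JWT each frontend request carries.

```javascript
// Rough sketch of the bare-metal backend in Node (18+). In-memory tallies
// only; the PubSub push is a stub to be replaced with the real Helix
// "Send Extension PubSub Message" call. A real EBS would also verify the
// extension JWT that each frontend request carries.
const http = require('http');

const tallies = new Map(); // channelId -> { choice: count, ... }

const server = http.createServer((req, res) => {
  if (req.method !== 'POST' || req.url !== '/vote') {
    res.writeHead(404).end();
    return;
  }
  let body = '';
  req.on('data', (chunk) => (body += chunk));
  req.on('end', () => {
    try {
      const { channelId, choice } = JSON.parse(body);
      const counts = tallies.get(channelId) ?? {};
      counts[choice] = (counts[choice] ?? 0) + 1;
      tallies.set(channelId, counts);
      res.writeHead(204).end();
    } catch {
      res.writeHead(400).end();
    }
  });
});

// Push the current scores to each channel once per second via PubSub.
setInterval(() => {
  for (const [channelId, counts] of tallies) {
    sendPubSubBroadcast(channelId, JSON.stringify(counts));
  }
}, 1000);

// Placeholder: swap in the real Helix extensions/pubsub request here.
function sendPubSubBroadcast(channelId, message) {
  console.log(`would broadcast to channel ${channelId}:`, message);
}

server.listen(8080);
```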

If my solution gets close to overloading the server then I’ll add a server to the pool.

Last November/December I was running 3 voting websites and/or extensions off the same single bare metal server, one of which was for The Game Awards.

I got nowhere near maxing the limit.

At any given point there are, say, 3 to 5 thousand channels live on Twitch.

Of those, even if you assume half of them are running your extension,
not all of them will be running a poll to vote on at the same time.

So your peak loads are a concern but your average traffic loads are not.

Billions of inputs/votes can be handled by a single bare metal server if needed, without having to consider the request counts of a cloud solution.
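
To put “billions a month” in perspective as an average request rate (peaks are what you actually plan headroom for):

```javascript
// Even a couple of billion votes per month is a modest *average* rate.
const votesPerMonth = 2e9;
const secondsPerMonth = 30 * 24 * 3600; // 2,592,000
console.log(Math.round(votesPerMonth / secondsPerMonth)); // ≈ 772 requests/second on average
```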

I guess you can start with cloud and, if your extension takes off, rearchitect the backend to fit within your cost allocations.

An additional case in point: I run two game-matched extensions which both accept data from the game and send the collected data to the channels live with the extension each second, and it just purrs along. All from the one bare metal server.

Sure, there’s generally not more than 10 channels live at once with each game extension, but that server is also running a hell of a lot of other stuff, including my IGDB extension, which fields quite a fair chunk of traffic at any given point.


That’s a really great insight that I absolutely needed. Thanks for putting up with me man…

I thought that every server rental would be priced per request, so I never thought about taking a bare metal one. I’m so dumb.

Thanks a lot!!