I'm having trouble understanding how I'm supposed to use luasocket to receive data efficiently. My application is a TCP server that waits for clients to connect and then handles them all in parallel. I decided to go with a non-blocking server socket and to poll with select over all the sockets: the actual connected sockets plus the listening socket.
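Roughly, the structure looks like this (a simplified sketch, not my actual code; the port number and bookkeeping are placeholders):

```lua
local socket = require("socket")

local server = assert(socket.bind("*", 12345))
server:settimeout(0)                        -- non-blocking accept

local clients = {}                          -- connected client sockets

while true do
  -- poll the listening socket and every client in one select call
  local watch = { server }
  for _, c in ipairs(clients) do watch[#watch + 1] = c end

  local readable = socket.select(watch, nil)

  for _, sock in ipairs(readable) do
    if sock == server then
      local client = server:accept()
      if client then
        client:settimeout(0)                -- non-blocking receives
        clients[#clients + 1] = client
      end
    else
      -- read whatever this client sent (this is the part I'm stuck on)
    end
  end
end
```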
The receive call takes a parameter that is supposed to be the number of bytes it will read. If I ask for 100 bytes on a socket in non-blocking mode (timeout 0) and only, say, 50 bytes have arrived, then receive returns "timeout" as its second return value and throws away those 50 bytes. Why not keep them for my next receive? Or, even better, why not return both the 50 bytes AND the "timeout" error?
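To make that concrete, this is the pattern I mean (simplified; `client` is one of the connected sockets from the loop above):

```lua
local data, err = client:receive(100)  -- ask for exactly 100 bytes
if data then
  -- all 100 bytes were buffered and read
elseif err == "timeout" then
  -- fewer than 100 bytes had arrived: data is nil, and the bytes
  -- that *were* read are nowhere to be found in these two values
end
```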
I can make my application work if I set the receive size to 1 byte, but that means aggregating the data byte by byte into a table and then concatenating it whenever I need it.
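Something like this (again just a sketch of the workaround, not my exact code):

```lua
local chunks = {}
while true do
  local byte, err = client:receive(1)  -- read a single byte at a time
  if byte then
    chunks[#chunks + 1] = byte
  else
    break  -- "timeout" (nothing buffered right now) or "closed"
  end
end
local data = table.concat(chunks)      -- stitch the bytes back together
```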
What am I missing here?
EDIT: Here's the critical part of the code: https://pastebin.com/raw/Q9WTQUKH