I can't find anyone covering this issue in a practical manner.
Let's say we have a fairly large multi-tenant SaaS Rails app with a bunch of ActiveRecord models and a React/Vue client powering the SPA front-end.
How do you approach making this app update data for all users in realtime?
I understand most articles out there show that you can use WebSockets to emit events to the client and listen for them on the frontend, but that's often an over-simplified view that doesn't cover:
- How do you abstract ActiveRecord data sync on both the backend and the frontend (similar to Firestore's data binding)?
- What about race conditions when emitting update events from ActiveRecord? Should there be versioning to avoid an old update event being applied after a newer one?
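One way to reason about the second point: tag every payload with a monotonically increasing version (for example Rails' `lock_version` column from optimistic locking) and have the consumer drop anything stale. Here's a minimal sketch of that guard, written in plain Ruby for illustration; `SyncedCollection`, `apply_event`, and the payload shape are assumptions, not something from the original setup:

```ruby
# Sketch of the stale-event guard a consumer would need: keep the highest
# version seen per record id and discard out-of-order events.
class SyncedCollection
  def initialize
    @records = {}   # id => latest applied attributes
    @versions = {}  # id => highest version applied so far
  end

  # Applies the event unless it is older than one already applied.
  # Returns true if applied, false if the event was stale.
  def apply_event(payload)
    id = payload.fetch("id")
    version = payload.fetch("version")
    return false if @versions.key?(id) && version <= @versions[id]

    @versions[id] = version
    if payload["_destroyed"]
      @records.delete(id)
    else
      @records[id] = payload
    end
    true
  end

  def record_for(id)
    @records[id]
  end
end

coll = SyncedCollection.new
coll.apply_event({ "id" => 1, "version" => 2, "title" => "new" }) # applied
coll.apply_event({ "id" => 1, "version" => 1, "title" => "old" }) # dropped as stale
```

Since `lock_version` is incremented atomically on every update, the same comparison also works if two app servers emit events for the same record nearly simultaneously: whichever event carries the higher version wins regardless of arrival order.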
I'm asking because I built a hackish, standardized way to emit changes from Rails via Pusher on all models:
class ApplicationRecord < ActiveRecord::Base
  self.abstract_class = true

  after_commit :sync_payload

  def sync_payload
    return unless respond_to?(:pusher_channel_name)

    channel = pusher_channel_name
    return if channel.blank? # blank? covers nil, "", and []

    event = "#{self.class.name.underscore.dasherize}-updated"

    if destroyed?
      push_payload(channel, event, { id: id, _destroyed: true })
    else
      push_payload(channel, event, as_json)
    end
  end
end
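For context, a model would opt in by defining its channel. A hypothetical example of how that could look in a multi-tenant setup; `Project`, the `belongs_to :account` association, and the tenant-scoped channel name are all assumptions, not from the original code:

```ruby
class Project < ApplicationRecord
  belongs_to :account # the tenant

  # Scope the channel per tenant so one tenant's updates never
  # reach another tenant's clients.
  def pusher_channel_name
    "private-account-#{account_id}"
  end
end

# An update to a Project in account 42 would then emit a
# "project-updated" event on the "private-account-42" channel.
```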
On the client side (Vue), I built mixin methods to listen for these events and update the data.
As you can see, this is subject to race conditions: it doesn't ensure events are applied in order, and there are various concerns about how reliable it is and whether users will always end up with up-to-date data.
I'm curious to see how others approach this problem.
Post Details
- Posted 4 years ago
- reddit.com/r/rails/comme...