This article provides a general overview of how to duplicate or export Stream Chat data to your own database. Once copied, this data can be fed into a 3rd party analytics platform. The Stream dashboard does not currently include an analytics platform, but you can easily integrate a 3rd party one using the two options shown below. If you have any further questions, please reach out to support at https://getstream.io/contact/support/!
1. Copying in Realtime (Webhook & SQS)
The first option is to copy data in realtime using the Stream Chat Webhook or SQS, an external database or data lake, and a 3rd party service (e.g. Segment, Mixpanel, Snowflake, etc.). This is also a common way customers copy data into their own databases to keep complete records and for purposes beyond analytics.
1. The first step is to replicate the data to your database or data lake via the Webhook or SQS. You can do this from your own server-side code base.
2. Then you can run analytics on this data with the 3rd party service of your choice (e.g. Segment, Mixpanel, Snowflake, etc.). Below are more details on using the Webhook and SQS with Stream Chat.
Webhook
The Webhook is triggered on a number of events. If you're looking to add messages into a database, the most important events to listen for will likely be the following:
- message.new - a new message is sent
- message.deleted - a message is deleted
- message.updated - a message is updated
Handle these events by upserting (or deleting) database entries as needed. Below is a gist containing a very simple webhook handler in Node.
https://gist.github.com/shodgetts/6ccfba9e7313b2be4ecf0b285890f4ad
Webhook Docs - https://getstream.io/chat/docs/node/push_webhook/?language=javascript
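As a rough illustration of the same idea, here is a minimal sketch of such a handler using Express. The route path and the upsertMessage/deleteMessage helpers are hypothetical placeholders for your own routing and data layer, and in production you would also want to verify the webhook signature as described in the docs above.

```js
// Minimal webhook handler sketch (Node + Express).
// upsertMessage and deleteMessage are hypothetical placeholders for your data layer.
const express = require('express');

const app = express();
app.use(express.json());

app.post('/stream-webhook', async (req, res) => {
  const { type, message } = req.body;

  if (type === 'message.new' || type === 'message.updated') {
    // Upserting covers both new and edited messages.
    await upsertMessage(message);
  } else if (type === 'message.deleted') {
    await deleteMessage(message.id);
  }
  // Other event types can simply be ignored.

  // Acknowledge quickly; heavy processing is better done off the request path.
  res.status(200).send('ok');
});

app.listen(3000);
```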
SQS
Amazon SQS is an alternative to the traditional Webhook and is potentially more reliable, as it is a queueing mechanism rather than a more ephemeral Webhook payload. The handling logic described above is essentially the same for SQS entries.
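Below is a rough sketch of an SQS consumer, assuming an AWS SDK v3 client, a queue URL stored in STREAM_SQS_QUEUE_URL, and the same hypothetical upsertMessage/deleteMessage helpers as the webhook example.

```js
// SQS consumer sketch: long-poll the queue, process each Stream event,
// then delete the entry so it is not redelivered.
const {
  SQSClient,
  ReceiveMessageCommand,
  DeleteMessageCommand,
} = require('@aws-sdk/client-sqs');

const sqs = new SQSClient({ region: process.env.AWS_REGION });
const queueUrl = process.env.STREAM_SQS_QUEUE_URL;

async function pollForever() {
  while (true) {
    const { Messages = [] } = await sqs.send(
      new ReceiveMessageCommand({
        QueueUrl: queueUrl,
        MaxNumberOfMessages: 10,
        WaitTimeSeconds: 20, // long polling
      })
    );

    for (const entry of Messages) {
      const event = JSON.parse(entry.Body);

      // Same handling logic as the webhook example above.
      if (event.type === 'message.new' || event.type === 'message.updated') {
        await upsertMessage(event.message);
      } else if (event.type === 'message.deleted') {
        await deleteMessage(event.message.id);
      }

      // Remove the entry from the queue once it has been processed.
      await sqs.send(
        new DeleteMessageCommand({
          QueueUrl: queueUrl,
          ReceiptHandle: entry.ReceiptHandle,
        })
      );
    }
  }
}

pollForever();
```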
2. Export Channel API Method
The second option is to use the channel export function. This asynchronous function gives you the entire history of all messages and metadata for the channels you request. The endpoint returns a job ID, which you can query the API with shortly afterward (usually a matter of seconds) to receive a secure JSON export of the channel in an S3 bucket! It is perfect for a live stream event, an app with a main channel, or any kind of event where you’ll gather a sizable amount of data in a set amount of time.
1. The steps are similar to the Webhook/SQS steps noted above. First, you will want to set up a server-side script that uses the channel export function (docs below) and then sends the data to your database or data lake. A fairly simple script could be written to fetch X number of messages from channel Y. For larger pulls, you may need to paginate and/or throttle; a rough sketch appears after the export docs link below.
https://getstream.io/chat/docs/channel_pagination/?language=js
2. As with option one above, you can then run analytics via the 3rd party service of your choice.
Export Channel Docs - https://getstream.io/chat/docs/javascript/exporting_channels/
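Here is a sketch of such a server-side script using the stream-chat Node SDK. The method names and response fields follow the export docs above; treat the channel type/id, the polling interval, and the saveToDataLake helper as illustrative assumptions.

```js
// Export sketch: start an asynchronous channel export, poll until the job
// completes, then hand the resulting S3 URL to your own storage layer.
const { StreamChat } = require('stream-chat');

const client = new StreamChat(process.env.STREAM_API_KEY, process.env.STREAM_API_SECRET);

async function exportChannel(type, id) {
  // Kick off the asynchronous export job and keep the returned task id.
  const { task_id } = await client.exportChannels([{ type, id }]);

  // Poll until the job finishes (usually a matter of seconds).
  let status;
  do {
    await new Promise((resolve) => setTimeout(resolve, 2000));
    status = await client.getExportChannelStatus(task_id);
  } while (status.status !== 'completed');

  // status.result.url points at the secure JSON export in S3; download it and
  // load it into your database or data lake from here.
  await saveToDataLake(status.result.url); // hypothetical helper
}

exportChannel('livestream', 'main-event');
```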