# getting-started
a
Can you please advise on these two questions? 1. Can we write taps in other languages (e.g. JS/TS)? I've seen competitors like Airbyte allow this. 2. Can we make some custom conditional calls? For example, if among the received data there are events where eventType is, say, login, then we make a call to our server via REST API? Or is it more correct to first just transport the data to our database, and then analyze the received data on our server?
s
Hey @alex_dimov : 1. Yes, Meltano does support non-Python plugins with a few caveats (let's say "Python is best supported"); here's a longer answer: https://meltano.slack.com/archives/C013EKWA2Q1/p1666305849947129 2. What you're describing sounds very much like an ELT anti-pattern, if I understand you correctly. The best practice here would be to import your data source (1), then do your "conditional filtering" to get your list of additional values and import those as data source (2), then join them together into a new model. This approach is usually far more robust and faster than making conditional calls/lookups as you ingest data.
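The "load first, join in the warehouse" pattern described above might look roughly like this, using sqlite3 as a stand-in warehouse; the table and column names are made up for illustration:

```python
# Hypothetical sketch of joining two loaded sources in the warehouse,
# instead of making per-record lookups during ingestion.
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# Source (1): raw events, loaded as-is by the EL pipeline.
cur.execute(
    "CREATE TABLE raw_events (event_id INTEGER, event_type TEXT, user_id INTEGER)"
)
cur.executemany(
    "INSERT INTO raw_events VALUES (?, ?, ?)",
    [(1, "login", 42), (2, "click", 42), (3, "login", 7)],
)

# Source (2): supplemental values fetched for the filtered subset
# (here, extra details relevant to login events), loaded separately.
cur.execute("CREATE TABLE login_details (user_id INTEGER, last_device TEXT)")
cur.executemany(
    "INSERT INTO login_details VALUES (?, ?)",
    [(42, "ios"), (7, "web")],
)

# The "new model": the conditional logic lives in a query over loaded
# data, not in conditional API calls made while ingesting.
rows = cur.execute(
    """
    SELECT e.event_id, e.user_id, d.last_device
    FROM raw_events e
    JOIN login_details d ON d.user_id = e.user_id
    WHERE e.event_type = 'login'
    ORDER BY e.event_id
    """
).fetchall()
print(rows)
```

In a real setup the join would typically live in a transformation layer (e.g. a dbt model) rather than ad-hoc SQL, but the shape of the logic is the same.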
a
The thing is, right now I use a model that creates a connection, then every 15 minutes I poll for events. If there are new events, I go through each one, make some modifications to it, supplement it with additional data from different places, then save a ready-to-show event to the database (which we then display where needed). Currently we do all this with JS/TS, but we want to switch to the pipeline model and are thinking about how to properly build our flow. Thanks for the advice.
s
Yes, that sounds like a good use case for what I've been describing above (it's also what I've implemented in the past for Greg Young Event Store event streams).
a
Is it open source?
s
Sorry, I meant that I implemented it using another E&L tool where the source streams were GY Event Store streams. But I implemented it just the way described above.
a
Ah, got it
thanks
a
@alex_dimov - A parent-child relationship between streams is one way this kind of design may be implemented using the SDK:
> if there are new events, then I go through each event, ... supplement them with additional data from different places, then save them to the database ready-to-show event
Alternatively, you could also make supplemental API calls per record to append additional data, using the SDK's `Stream.post_process()` override point.
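A minimal sketch of the two SDK patterns mentioned above. These are plain classes so the example is self-contained; in a real tap they would subclass `singer_sdk.Stream` (or `RESTStream`), and the method names here mirror the SDK's actual override points (`get_child_context()`, `post_process()`). The stream names and record fields are made up:

```python
class EventsStream:
    """Parent stream: yields raw events (stand-in for a paginated API)."""

    name = "events"

    def get_records(self, context):
        # In a real RESTStream, records would come from API responses.
        yield {"event_id": 1, "eventType": "login", "user_id": 42}
        yield {"event_id": 2, "eventType": "click", "user_id": 42}

    def get_child_context(self, record, context):
        # Mirrors Stream.get_child_context() in the SDK: the returned
        # dict is passed to each child stream as its partition context.
        return {"event_id": record["event_id"]}


class LoginDetailsStream:
    """Child stream: one extra lookup per parent record."""

    name = "login_details"
    parent_stream_type = EventsStream  # same idea as the SDK attribute

    def get_records(self, context):
        # context carries the parent's event_id.
        yield {"event_id": context["event_id"], "detail": "geo/device info"}

    def post_process(self, row, context):
        # Mirrors Stream.post_process(): enrich (or drop, by returning
        # None) each record just before it is emitted to the target.
        row["enriched"] = True
        return row
```

A rough walk-through of how the pieces connect: each record from `EventsStream` produces a context via `get_child_context()`, the child stream fetches its records for that context, and `post_process()` gets a final chance to modify each row.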
a
Understood, thank you. I'll keep figuring out how to make it all work.