# singer-target-development
r
Running into some problems upgrading `target-mssql` to a newer SDK version, where it is not playing nicely with the `SQLConnector` refactor back in v0.20.0... I'm getting an `Invalid object name` error, which suggests to me some issue with the target engine/connection implementation (works fine on v0.19.0, breaks on v0.20.0). Would appreciate some pointers if anyone has experience here. 🙂

`connector.py`: https://github.com/storebrand/target-mssql/blob/56968e01ab9f7295ef9a0aeeec96459353185d70/target_mssql/connector.py

SDK v0.19.0 -> v0.20.0 diff: https://github.com/meltano/sdk/compare/v0.19.0...v0.20.0?diff=split&w=#diff-db20b40a2eb1ac17938f49d9757779cdb9998129d7fc075a964d20096ddb4b48
b
Maybe try looking for anything that is using `self.connection.execute`, which is deprecated, and replace it with a variation of this code I grabbed from the SDK's SQL Sink `bulk_insert_records`:

```python
with self.connector._connect() as conn, conn.begin():
    result = conn.execute(insert_sql, new_records)
```
Working with this in a test project I am starting to think that the temp table is being deleted because the connection is closed before the insert occurs.
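The temp-table lifetime issue can be demonstrated with a small stand-in (SQLite here instead of MSSQL, since both engines scope temp tables to the session that created them and drop them when that session ends):

```python
import os
import sqlite3
import tempfile

# SQLite stand-in for MSSQL tempdb behavior: a TEMP table belongs to the
# connection that created it and is dropped when that connection closes.
path = os.path.join(tempfile.mkdtemp(), "demo.db")

conn1 = sqlite3.connect(path)
conn1.execute("CREATE TEMP TABLE staging (id INTEGER)")
conn1.execute("INSERT INTO staging VALUES (1)")
print(conn1.execute("SELECT COUNT(*) FROM staging").fetchone()[0])  # prints 1
conn1.close()  # the temp table is dropped here

# A second connection to the same database no longer sees the temp table.
conn2 = sqlite3.connect(path)
err = ""
try:
    conn2.execute("SELECT * FROM staging")
except sqlite3.OperationalError as exc:
    err = str(exc)
    print(err)  # no such table: staging -- the SQLite analogue of "Invalid object name"
conn2.close()
```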
r
Yeah, I already tried the context manager fix locally but I hit the same problem. I think you're right on the connection being closed, but I can't see what change in the SDK would have caused that.
b
I think it is this, which can be found in the SQLConnector class:

```python
@contextmanager
def _connect(self) -> t.Iterator[sa.engine.Connection]:
    with self._engine.connect().execution_options(stream_results=True) as conn:
        yield conn
```

When you call `_connect` you are dealing with a context manager that, I believe, will close the connection once an execution occurs and finishes. Off the top of my head, your options are to find a way to have the Sink create and sustain a connection from `_engine` for the entire merge-insert process, or to move to permanent tempdb tables that will stay once a connection closes, though you will need to track and clean them up.
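The first option (one sustained connection for the whole merge) can be sketched with the same SQLite stand-in: because every step runs on the same connection, the temp table survives until the merge finishes.

```python
import sqlite3

# One connection held open for the entire staging-and-merge process.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE target (id INTEGER, name TEXT)")

# All staging steps share the connection, so the temp table persists.
conn.execute("CREATE TEMP TABLE staging (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO staging VALUES (?, ?)", [(1, "a"), (2, "b")])
conn.execute("INSERT INTO target SELECT * FROM staging")  # the "merge" step
conn.execute("DROP TABLE staging")

count = conn.execute("SELECT COUNT(*) FROM target").fetchone()[0]
print(count)  # prints 2
conn.close()
```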
Worked on this a little more and ran into the fact that you need rights to tempdb in order to create a permanent table there, which I could see being problematic for some people to get. I did find that if you create a `_sink_conn` variable during the sink init process

```python
self._sink_conn = self.connector._engine.connect()
```

you can then use that in the functions called in `process_batch`. The only other change was to update `create_temp_table_from_table` in the mssqlConnector class to take in a connection argument and use it:

```python
def create_temp_table_from_table(self, from_table_name, sink_con=None):
    """Temp table from another table."""
    if sink_con is None:
        sink_con = self.connection
```

It works, but I bet there is a better way to do it. I hope this at least gives you some helpful information to work with.
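A minimal sketch of how the pieces fit together. This is not the real SDK or target-mssql code: the `Connector` and `Sink` classes here are simplified stand-ins, and `sqlite3` replaces the MSSQL engine, but the pattern is the one described above (one connection held by the sink and threaded through `create_temp_table_from_table` via `sink_con`):

```python
import sqlite3

class Connector:
    """Simplified stand-in for mssqlConnector; sqlite3 replaces the MSSQL engine."""

    def connect(self):
        return sqlite3.connect(":memory:")

    def create_temp_table_from_table(self, from_table_name, sink_con=None):
        # Accept the sink's long-lived connection so the temp table is created
        # on (and survives with) that connection.
        sink_con.execute(
            f"CREATE TEMP TABLE tmp AS SELECT * FROM {from_table_name} WHERE 0"
        )

class Sink:
    """Simplified stand-in for the SQL Sink."""

    def __init__(self, connector):
        self.connector = connector
        # One connection held for the lifetime of the sink.
        self._sink_conn = self.connector.connect()

    def process_batch(self, records):
        self._sink_conn.execute("CREATE TABLE IF NOT EXISTS target (id INTEGER)")
        # Temp table, inserts, and merge all use the same connection.
        self.connector.create_temp_table_from_table("target", sink_con=self._sink_conn)
        self._sink_conn.executemany("INSERT INTO tmp VALUES (?)", records)
        self._sink_conn.execute("INSERT INTO target SELECT * FROM tmp")
        self._sink_conn.execute("DROP TABLE tmp")
        return self._sink_conn.execute("SELECT COUNT(*) FROM target").fetchone()[0]

sink = Sink(Connector())
print(sink.process_batch([(1,), (2,)]))  # prints 2
```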
r
@BuzzCutNorman Appreciate you looking into this. Not sure how "best practice" this is, but I ended up just caching the `connection` property for the time being: https://github.com/Matatika/target-mssql/blob/9abc4d00df1123a1f239d03ea7f4c8cf223b2faf/target_mssql/connector.py#L400-L402 Bit of a hollow victory, as I don't think we need the version upgrade after all now, but I suppose it's good just for keeping the plugin updated.
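For anyone skimming later, one way to cache a `connection` property looks like this. This is a hypothetical simplification, not the code behind the link above, and again uses `sqlite3` in place of the MSSQL engine:

```python
from functools import cached_property
import sqlite3

class Connector:
    """Sketch only: cache the connection so repeated property access reuses
    one live connection instead of opening a fresh one each time."""

    @cached_property
    def connection(self):
        # Opened once on first access, then reused; a temp table created on
        # this connection stays visible for as long as the connector lives.
        return sqlite3.connect(":memory:")

c = Connector()
assert c.connection is c.connection  # same object on every access
```

The trade-off is that the connection is never closed until the connector is discarded, so long-running targets need to account for that.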