I try to test as much as possible locally.
Everything depends on which target(s) you use.
If you use a target that you can start locally (e.g. PostgreSQL or DuckDB), it's easy (and much faster than testing against a cloud service).
If you use e.g. Snowflake, you can still test locally with DuckDB, but things may fail later when running against Snowflake. It's still worth doing, IMO.
Moreover, if e.g. dbt transforms what Meltano produces, you can test that locally too. Here you can utilize Jinja macros to make your SQL models more or less agnostic to the DB engine, as in the sketch below.
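A minimal sketch of such a macro, branching on dbt's `target.type` (the macro name, functions and JSON path here are just illustrative, not taken from any particular project):

```sql
{# macros/json_extract.sql — hypothetical macro making JSON access portable across engines #}
{% macro json_extract(column, path) %}
    {% if target.type == 'snowflake' %}
        get_path(parse_json({{ column }}), '{{ path }}')
    {% elif target.type == 'duckdb' %}
        json_extract({{ column }}, '$.{{ path }}')
    {% else %}
        {{ column }}::jsonb -> '{{ path }}'
    {% endif %}
{% endmacro %}
```

A model can then call `{{ json_extract('payload', 'customer_id') }}` and run unchanged against DuckDB locally and Snowflake in the cloud.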
Regarding how to test locally - I prefer to set up a docker-compose file starting permanent services (e.g. PostgreSQL) and allowing ad-hoc services to be executed (e.g. extract_load powered by Meltano).
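Roughly like this (just a sketch - the image, tap/target names and env vars are placeholders, not the exact setup from my repo):

```yaml
# docker-compose.yaml
services:
  postgres:            # permanent service: docker compose up -d postgres
    image: postgres:16
    environment:
      POSTGRES_PASSWORD: postgres
    ports:
      - "5432:5432"

  extract_load:        # ad-hoc service: docker compose run --rm extract_load
    build: .           # assumes a Dockerfile installing Meltano and the project
    command: ["meltano", "run", "tap-github", "target-postgres"]
    environment:
      TARGET_POSTGRES_HOST: postgres
    depends_on:
      - postgres
```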
As an alternative, for developer convenience, you can provide a Makefile with the same tasks, so developers can iterate faster...
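Something along these lines (again just a sketch, task and plugin names are illustrative):

```makefile
# Makefile mirroring the docker-compose tasks for faster local iteration
.PHONY: up extract_load transform

# start the permanent services in the background
up:
	docker compose up -d postgres

# run the Meltano EL job directly on the host
extract_load:
	meltano run tap-github target-postgres

# run dbt transformations against the local target
transform:
	dbt run --target local
```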
I built a "blueprint" data pipeline here; it's open source, so feel free to reuse anything 😉
https://github.com/jaceksan/gooddata-data-pipeline