@Stéphane Burwash, you would usually try to get by without additional tools, although dbt metrics do take some of that off your shoulders. As @aaron_phethean points out, you're usually fine with a model/dataset layer and possibly something that allows for more dynamic queries (that's the granularity you're talking about). Honestly, though, I've never seen the precomputation of metrics become a severe problem. I have, however, seen a flood of dashboards using thousands of different granularities become one. 🙂
If you only precompute, you don't even need dbt metrics. And btw, "recomputing metrics" at every granularity isn't strictly necessary. What I've seen work in the past:
1. Compute it once at the lowest granularity
2. Use an aggregation function to auto-create all the other granularities (see the sketch after this list).
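A minimal sketch of what I mean, with made-up names (`orders` as the source model, daily as the lowest granularity; none of this is from an actual project):

```sql
-- models/metric_revenue_daily.sql (hypothetical model)
-- Compute the metric exactly once, at the lowest granularity (daily)
select
    order_date as metric_date,
    sum(amount) as revenue
from {{ ref('orders') }}  -- assumed source model
group by 1
```

```sql
-- models/metric_revenue_monthly.sql (hypothetical model)
-- Derive the coarser granularity by rolling up the daily model
-- instead of recomputing the metric from the raw data
select
    date_trunc('month', metric_date) as metric_month,
    sum(revenue) as revenue
from {{ ref('metric_revenue_daily') }}
group by 1
```

That works as long as the metric is additive (sums, counts). For non-additive metrics like distinct users, you'd have to aggregate from the lowest-granularity fact rows instead of from the rolled-up model.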
Or even:
1. Compute the metric once in a separate model, then use appropriate functions to map it into the final models (sketch below).
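Again just a rough sketch with hypothetical names: the metric lives in its own model, and the final models simply join it in rather than recomputing anything.

```sql
-- models/metric_revenue_by_customer.sql (hypothetical)
-- The metric is computed exactly once, in its own model
select
    customer_id,
    sum(amount) as revenue
from {{ ref('orders') }}
group by 1
```

```sql
-- models/fct_customer_summary.sql (hypothetical final model)
-- Map the precomputed metric into the final model via a join
select
    c.customer_id,
    c.region,
    m.revenue
from {{ ref('customers') }} as c
left join {{ ref('metric_revenue_by_customer') }} as m
    on m.customer_id = c.customer_id
```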
If you have a problematic example, I'm happy to take a look. Would love to finally see an alarming instance of precomputation. (Fwiw, Max doesn't argue in favor of precomputation; he's OK with views/dynamic queries as described by @aaron_phethean.)