How to Actually Learn Web3 Data Analytics in 2026
Part 2 of 2: The complete roadmap from zero to earning opportunities

In Part 1, we established the problem. Most Web3 protocols are drowning in data but starving for insights. They track vanity metrics that look good on Twitter but do nothing to drive decisions. The gap between having dashboards and actually using data is wider than most people realize.
But that gap is also an opportunity. If you can learn to ask the right questions and build focused analysis that drives real decisions, you will be in rare company. AI can generate SQL queries all day long. It cannot replace the fundamentals of knowing what to measure and why it matters.
So, how do you actually learn this skill in 2026? That is what we are covering here.
The AI Paradox: Your Best Tool and Worst Teacher
Let us address the elephant in the room. AI tools like Claude and ChatGPT can write better SQL than most junior analysts. They debug queries, optimize performance, and generate complex joins in seconds. So, has the barrier to entry for data analysis disappeared?
Joel was unequivocal. You need to know the fundamentals before you can prompt AI effectively. Without that foundation, he said, “you and the AI will be going back and forth for days.”
Chris shared a story that drove this home. He has been working on a project since last year, and for the last two weeks, the entire team, including AI, has been trying to resolve a single metric. The problem is not writing the query. The problem is defining what the metric should actually measure.
Two weeks. One metric. With AI helping the whole time.
Here is what is really happening. AI is extraordinary at executing well-defined instructions. But it cannot define what you should be measuring or how to calculate it correctly without the context that only you can provide. If you do not understand SQL fundamentals, on-chain data structures, and the protocols you are analyzing, you will spend days going in circles, never knowing if the results are even correct.
The fundamentals are not optional. They are more important than ever.
The Learning Roadmap That Actually Works
Based on our combined experience and mistakes, here is the framework that works for learning data analysis in the age of AI.
Step 1: Master the Fundamentals
Joel broke down what fundamentals actually mean in practice. You need to understand SQL itself. You need in-depth knowledge of the protocol you are analyzing, including every term and every technical aspect. And you need to know the formulas for the metrics you want to derive.
On the SQL side, that means SELECT, WHERE, JOIN, and GROUP BY. It means aggregation functions like COUNT, SUM, and AVG. It means window functions for rankings and running totals, and CTEs for organizing complex queries.
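Those building blocks can be sketched against a tiny synthetic table. Everything below, the `transfers` table, its columns, and the rows, is invented for illustration (real on-chain tables, such as those on Dune, have different schemas); the sketch runs SQLite through Python only so the SQL is executable end to end.

```python
import sqlite3

# Tiny made-up dataset standing in for an on-chain transfers table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE transfers (sender TEXT, amount REAL, day TEXT);
INSERT INTO transfers VALUES
  ('0xaaa', 100, '2026-01-01'),
  ('0xbbb',  50, '2026-01-01'),
  ('0xaaa', 200, '2026-01-02'),
  ('0xccc',  75, '2026-01-02');
""")

# CTE + WHERE + GROUP BY + aggregates, then a window function
# for a running total across days.
rows = conn.execute("""
WITH daily AS (                        -- CTE: organize the query in steps
  SELECT day,
         COUNT(*)    AS transfers,    -- aggregation
         SUM(amount) AS volume
  FROM transfers
  WHERE amount > 0                    -- filtering
  GROUP BY day
)
SELECT day, transfers, volume,
       SUM(volume) OVER (ORDER BY day) AS running_volume  -- window function
FROM daily
ORDER BY day
""").fetchall()

for r in rows:
    print(r)
```

The same pattern scales up: the CTE isolates one logical step, and the window function turns per-day aggregates into a cumulative series without a self-join.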

On the on-chain side, you need to understand how blockchain data is organized in tables, what the columns actually represent (addresses, hashes, amounts), how decimal conversions work (those 10^18 values you see everywhere), and how to preview tables to understand what data is available.
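The decimal conversion is worth seeing once. Raw ERC-20 amounts are stored as integers scaled by 10^decimals (18 for ETH and most ERC-20 tokens, 6 for USDC), so "1.5 tokens" appears on-chain as a huge integer. The values here are illustrative:

```python
# Raw on-chain value as it appears in a transfer row (made-up example).
raw_value = 1_500_000_000_000_000_000

# Token-specific: 18 for most ERC-20s, 6 for USDC. Always check the
# token contract rather than assuming.
DECIMALS = 18

# Divide by 10**decimals to recover the human-readable amount.
human_value = raw_value / 10**DECIMALS
print(human_value)  # 1.5
```

Forgetting this conversion is one of the most common beginner mistakes: the query runs fine, but every amount is off by eighteen orders of magnitude.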
Then there is protocol knowledge. Read the documentation thoroughly. Understand the formulas. Joel spent a full day figuring out how borrow APY is calculated and how it relates to utilization rates. That kind of deep reading is not optional.
Joel gave a perfect example. If you want to calculate utilization rates and you give AI only the supplier data without including total borrows, the AI will hallucinate a table that does not exist. Then you are stuck in an endless loop. Without the fundamentals, AI becomes a liability instead of an asset.
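To make the utilization example concrete: one common definition, and exact formulas vary by protocol, which is precisely why Joel's deep reading matters, is total borrows divided by total supplied liquidity. The numbers below are made up:

```python
# A common lending-pool definition (verify against the protocol's docs;
# Aave and Compound each define this precisely in their documentation):
#   utilization = total borrows / total supplied
total_borrows  = 800_000.0    # assets currently lent out (illustrative)
total_supplied = 1_000_000.0  # total deposits in the pool (illustrative)

utilization = total_borrows / total_supplied
print(f"{utilization:.0%}")  # 80%
```

Notice that the formula needs both borrows and supply. Give AI only one of the two, as in Joel's example, and it has to invent the other, which is where the hallucinated tables come from.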
Step 2: Write Queries by Hand
This is where Chris's advice gets counterintuitive.
“Sometimes, just use your hand and write the query first,” he said. He recalled his early days going through Andrew Hong’s tutorials with a pen and paper, writing SQL line by line.
Writing SQL on paper in 2026, when AI can generate perfect queries instantly? Yes. And here is why it works.
The act of physically writing code forces your brain to process it differently than reading or copying does. It is the same principle copywriters have used for decades. You are not writing queries by hand to be productive. You are doing it to learn. Once you have internalized the patterns, how JOINs work, how to structure aggregations, and how to filter efficiently, then AI becomes a powerful accelerator. Skip this step, and you will forever depend on AI without understanding what it is doing.
Step 3: Learn Metrics One at a Time
Chris introduced what he calls the “metric-by-metric” approach. Today, learn about active users: a COUNT(DISTINCT ...) over sender addresses. Master that, and you can apply it anywhere. Tomorrow, learn how TVL works.
Here is how it plays out in practice. On day one, learn the basic active users query. Understand why DISTINCT matters. Practice filtering by time periods. Learn to segment by user type. On day two, move to Total Value Locked. Understand how it is calculated differently across protocols. Figure out how to account for price changes. Practice aggregating across multiple assets. By the end of your first month, aim to fully understand five core metrics in your niche.
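The day-one metric can be sketched in a few lines. The table and addresses below are synthetic (real schemas differ); the point is why DISTINCT matters, because without it you are counting transactions, not users:

```python
import sqlite3

# Synthetic activity table: one user transacts twice on the same day.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE txs (sender TEXT, day TEXT);
INSERT INTO txs VALUES
  ('0xaaa', '2026-01-01'),
  ('0xaaa', '2026-01-01'),  -- same user, second transaction
  ('0xbbb', '2026-01-01'),
  ('0xaaa', '2026-01-02');
""")

rows = conn.execute("""
SELECT day,
       COUNT(*)               AS tx_count,     -- raw activity
       COUNT(DISTINCT sender) AS active_users  -- unique users
FROM txs
GROUP BY day
ORDER BY day
""").fetchall()
print(rows)  # [('2026-01-01', 3, 2), ('2026-01-02', 1, 1)]
```

Three transactions on day one, but only two users. Conflating the two is exactly the kind of vanity-metric error Part 1 warned about.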
Chris shared his own process. There was a period where he went through every Dune dashboard related to DEXs and wrote out all available metrics anyone could use. He then learned how to write the query for each one. This approach builds knowledge piece by piece. You are not trying to learn everything at once. You are building a library of formulas you truly understand and can apply anywhere.

Step 4: Pick Your Niche and Go Deep
All three of us emphasized this point. Do not try to become an expert on every protocol across every chain.
Joel has specialized in lending protocols and risk analysis. He can explain the exact formulas for calculating liquidation thresholds and articulate why Aave’s approach differs from Compound’s. Chris focuses on payments and smart account abstraction. He understands the specific metrics that matter in that space. I have found my focus in areas where I am actually using the protocols I analyze.
The pattern is clear. Depth beats breadth every time.
When you go deep, you build real expertise instead of surface-level knowledge of everything. You see patterns others miss. You become referable. People say, “Talk to Joel about lending risk,” not “Joel does some data stuff.” And each new protocol in your niche becomes easier to understand because you already know the category.
The key is to pick something you are actually interested in. If you live on Solana using Jupiter every day, start there. If you are fascinated by how payments work, focus there. Your natural curiosity will carry you through the learning curve.
The Content Multiplier Effect
Once you have built real skills and started creating dashboards, the next question is: how do you turn this into opportunities? This is where most analysts fail. Not because their work is bad, but because nobody knows they are doing it.
Chris brought up Alex, an analyst who focuses on Visa and Polygon payments. Alex does not have more than three charts. But he keeps writing articles from just those three charts. Payment data changes daily, so every week brings new insights, new patterns, and new angles. Those three core charts become weekly Twitter threads on volume trends, long-form articles when interesting anomalies appear, monthly reports showing ecosystem growth, and real-time commentary during major announcements.
The structure stays the same. The data refreshes constantly. The insights compound over time.
Compare that to analyzing a mature protocol where not much changes week to week. You might build an excellent dashboard, but you will not have constant content opportunities.
The principle is simple. Pick an active niche where the data changes frequently enough to stay interesting, but is stable enough that your dashboards remain relevant.
High-Velocity vs. Low-Velocity Niches
We identified clear patterns. DeFi trading volume, new protocol launches, payment rails, and NFT markets are high-velocity niches with daily changes and constant opportunities for content. Mature protocols with stable metrics, governance, and infrastructure are low-velocity niches with long-term trends that change slowly.
Neither is better. They are just different. High-velocity niches give you more content opportunities. Low-velocity niches might give you deeper, more strategic consulting work. Choose based on your goals and what actually interests you.
Show Your Work
Chris called Joel and me out directly on this, and he was right.
“When you work hard, please show your work. Nobody is going to know how good you are if you just don’t show it.”
This is where I struggle most. I will spend weeks building a deep analysis, mention it casually to Joel, then weeks later, he will check my Twitter and see nothing. No threads, no posts, no evidence it ever happened. From the outside, it looks like I am doing nothing. And in the reputation economy of Web3, if people do not know you are doing good work, you might as well not be doing it.
Why is this so hard? The work never feels perfect. Someone else probably did it better. What if you are wrong? What if it is under NDA? All valid-sounding reasons, and all of them are excuses.
The Web3 community is remarkably supportive of people doing good work publicly. Yes, someone might criticize your analysis. That is actually valuable. It is how you get better. The alternative is silence, which helps nobody.
Showing your work does not need to be complicated. Share your dashboard on Twitter with three to five key insights. Write a blog post exploring one interesting finding. Screen-record yourself walking through a dashboard. Make your work public on Dune and let others fork and learn from your queries. The format matters less than the consistency. Pick one and commit to it.
The Resource Limitations Nobody Talks About
Joel brought up a practical constraint that trips up everyone: platform limitations.
Free Dune accounts get 2,500 credits. Analyzing Aave across all EVM chains requires thousands of credits per query. Comparing Aave to every other lending protocol with real detail? You are looking at a Plus account minimum ($390/year), possibly Premium ($3,900/year) if you are refreshing regularly.
Your imagination might exceed what is feasible with free tools. Comprehensive cross-chain analysis is expensive. But that is not a reason to give up. It is a constraint that forces you to be more thoughtful about what you analyze and why.
Start with one chain, prove the value, then expand. Optimize queries aggressively by using indexed tables, filtering early, and limiting results. Use cached queries from others when possible. If you are serious about this as a career, paid accounts are worth the investment.
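"Filter early and limit results" can be shown with a small sketch. The table is synthetic; on Dune, the analogous move is filtering on a partitioned date column so the engine scans (and bills) less data:

```python
import sqlite3

# Synthetic transfers table with an index standing in for a
# partitioned/indexed date column on a real platform.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE transfers (day TEXT, amount REAL);
CREATE INDEX idx_transfers_day ON transfers(day);
INSERT INTO transfers VALUES
  ('2026-01-01', 10),
  ('2026-01-02', 20),
  ('2026-01-03', 30);
""")

rows = conn.execute("""
SELECT day, SUM(amount) AS volume
FROM transfers
WHERE day >= '2026-01-02'   -- filter early: only scan the window you need
GROUP BY day
ORDER BY volume DESC
LIMIT 10                    -- cap output rows
""").fetchall()
print(rows)  # [('2026-01-03', 30.0), ('2026-01-02', 20.0)]
```

The habit is the same at any scale: push the date predicate into the scan instead of aggregating the full history and trimming afterward.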

What We Would Tell Our Younger Selves
Toward the end of our conversation, I asked each of us to distill everything into one piece of advice.
Joel: Master the fundamentals. Do not skip the basics to chase AI shortcuts. You will pay for it later when you cannot tell if your results are correct, and you cannot iterate without going back to AI every single time.
Chris: Train like an athlete. If you want to play this game for the long term, you need to train hard. Learn SQL again. Write things down by hand if you have to. Build your knowledge brick by brick. And remember that blockchain is open source. Do not just learn existing metrics. Invent new ones when the situation calls for it.
My take: Follow your curiosity. Do not force yourself to analyze protocols you do not care about or chains you do not use. Your curiosity drives your learning. The questions you naturally ask become the queries you write. The dashboards you build solve problems you actually care about, which means other people probably care too. It is a marathon, not a sprint.
Your Next Steps
Start with the fundamentals. Pick Andrew Hong’s Dune Analytics tutorials or the Wizard Analytics Guide and work through them with pen and paper. Do not move to the next lesson until you understand the current one.
Then choose your niche. What protocols do you actually use? What data makes you curious? Where can you see yourself going deep for months or years?
Learn one metric at a time. Today, active users. This week, TVL or revenue or whatever matters most in your niche. Build and share. Create your first dashboard. It will be rough, and that is fine. Share it publicly with what you learned. Iterate based on feedback.
And show your work consistently. Pick a format. Commit to sharing at least once a week. Build in public and document your progress.
Listen to the full episode where we went deeper on specific formulas, the Ethena collapse, and the exact tools we use daily. Available wherever you listen to podcasts.
Watch and Listen to the full podcast episode on YouTube.
Watch and Listen to the full podcast episode on Spotify.
Resources mentioned in this series:
Andrew Hong’s Dune Analytics tutorials (YouTube),
Sam’s Dune Analytics channel,
Protocol documentation (always start here),
Dune Analytics (free account to start).
If you found this valuable, share it with someone trying to break into data, or with a protocol team that needs to level up their analytics game.
If you’re new here, subscribe so you don’t miss out.