First off, I really appreciate all the feedback and likes on my post walking through the Optimism dashboard.
With my newfound knowledge of querying L2s, I ventured off to apply it to another ecosystem. Currently Dune offers three other options: xDai, BSC, and Polygon. I have been a big fan of what Polygon has established, and I am a holder of their token, so I decided to head that route.
Heading into the queries, I knew right off that Polygon was a completely different beast than Optimism in terms of data size. The Optimism queries needed no limits on row counts or time frames because it is a fledgling L2, whereas Polygon has been around for roughly two years. Picking a timeframe to aggregate over would come down entirely to how fast the queries ran.
Scanning through the available tables, I was pleased to see that most of the abstractions I had studied for Optimism were present for Polygon too. Thanks, Dune! That included the polygon.transactions and polygon.logs tables.
Since Polygon is a PoS chain and not a rollup, I did not expect to see tables expressing batches or elements the way Optimism does. I will have to wait until Arbitrum to apply that piece.
I started with the transactions table, and as expected, the queries were really slow on my first pass. Dune has a preset timeout of roughly 30 minutes, so my process basically entailed hitting the Run button on four tabs, leaving for an errand, and checking back 30 minutes later to see how I did. Pretty painstaking, but this is what trial and error looks like.
Eventually I landed on a timeframe of 90 days, which I think is sufficient to be useful.
To establish the 90-day timeframe I used a combination of the now() function and the INTERVAL keyword. In many places it read like this:
where "block_time" >= now() - interval '90 days'
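To make that concrete, here is a minimal sketch of a daily transaction-count query using that window (assuming polygon.transactions mirrors the column layout of its Ethereum counterpart on Dune's v1 PostgreSQL engine):

select date_trunc('day', block_time) as day,
       count(*) as tx_count
from polygon.transactions
where block_time >= now() - interval '90 days'
group by 1
order by 1;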
I was able to successfully recreate the transactions and unique-user queries with little to no deviation from the Optimism examples.
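The unique-user count looked roughly like this (a sketch; "from" is quoted because it is a reserved word in PostgreSQL):

select count(distinct "from") as unique_wallets
from polygon.transactions
where block_time >= now() - interval '90 days';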
I knew Polygon was a different beast, but I was still blown away by the transaction volume: roughly 520M transactions in 90 days, and over 2M unique wallets in the same period. That is some major scaling. Kudos, Polygon!
When I got to the gas cost queries, I decided to deviate a little. Gas on Polygon is paid in MATIC rather than the ETH used on Optimism, and since many users might not be in tune with the daily price of MATIC, I decided to convert the gas cost to USD. An average of $0.05 per transaction is a major breath of fresh air for users.
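A rough sketch of the conversion, assuming Dune's prices.usd table carries a minute-level MATIC price keyed by symbol (gas_used and gas_price are denominated in wei, hence the 1e18 divisor):

select date_trunc('day', t.block_time) as day,
       avg(t.gas_used * t.gas_price / 1e18 * p.price) as avg_fee_usd
from polygon.transactions t
join prices.usd p
  on p.symbol = 'MATIC'
 and p.minute = date_trunc('minute', t.block_time)
where t.block_time >= now() - interval '90 days'
group by 1
order by 1;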
Next I took a shot at calculating how much WETH had been bridged over to Polygon. I headed to polygonscan to grab the WETH contract address, shown below.
contract_address = '\x7ceb23fd6bc0add59e62ac25578270cff1b9f619'
I summed the burns and mints using the same method as the Optimism example, then began widening the timeframe to see how much data I could get back before the timeout. In the end, I could not get even a single day returned. I could get rows back from the logs table using LIMIT, but the second I added date_trunc on "block_time" the query screeched to a halt.
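For reference, the burn/mint sum I was attempting looked roughly like this (a sketch, not my exact query: topic1 is the ERC-20 Transfer event signature, mints arrive from the zero address and burns go to it, and bytea2numeric is Dune's v1 helper for decoding the amount stored in the data field):

select sum(case when topic2 = '\x0000000000000000000000000000000000000000000000000000000000000000'
                then  bytea2numeric(data) / 1e18   -- mint: bridged in
                else -bytea2numeric(data) / 1e18   -- burn: bridged out
           end) as net_weth_bridged
from polygon.logs
where contract_address = '\x7ceb23fd6bc0add59e62ac25578270cff1b9f619'
  and topic1 = '\xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef'
  and (topic2 = '\x0000000000000000000000000000000000000000000000000000000000000000'
    or topic3 = '\x0000000000000000000000000000000000000000000000000000000000000000');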
I reached out to the Dune team on Discord, and it seems this is part of the growing pains for a startup. Fine by me; I shelved the query and will come back to it later.
Here is the final result (also pictured above). Not a ton of data, but still useful.
Visualizing data for an entire ecosystem certainly has its challenges, but it was a positive learning experience in the end. I will set this dashboard aside for now and maybe come back and add to it later (OpenSea stats, perhaps?).
Back to browsing Dune to find my next learning experiment.