Dune-Layer 2
--
The queries for this post can be found here — https://dune.xyz/queries/153902
After spending some time in NFT land, I am thirsty to learn about something new.
A hot topic lately has been the release of Ethereum Layer 2 protocols, and I found a very nice dashboard that seems perfect to dive into.
The dashboard is called Optimistic-Ethereum and its author is “@Marcov”.
Before I dive in, I wanted to explain a little about my learning process and why I’m writing here. After reviewing the queries from the example dashboard, I then write them from scratch on my own. I want to make sure I can get each query to execute the same as the author’s, and it gives me an opportunity to practice and work through bugs. Writing the explanations out on Medium further ingrains my understanding, and is also meaningful to me if it helps even one person out. I will always encourage actually writing the queries as opposed to reading about them and assuming you understand.
OK, let’s dive in.
Total Transactions per day-Similar to the ethereum.”transactions” table, there is a matching table called optimism.”transactions”. Very convenient. Once you locate this table, the calculations for transaction counts are pretty basic SQL.
select
date_trunc('day', "block_time") as "day"
, count(*)
from optimism."transactions"
group by date_trunc('day', "block_time")
order by "day" asc
Unique Users Per Day-This section uses the same table as the transactions query above, but instead of counting the rows in the transactions table, we count the distinct (unique) addresses in the “from” column. To make sure we are not double dipping users across multiple days (hence unique users), we first select the min(block_time) grouped by “from” to get only the first day a new address used Optimism. We can then count the new users for each day, and finally use a window (OVER) clause to make this sum cumulative.
with first_tx as (
select min("block_time") as "min_block_time"
, "from"
from optimism."transactions"
group by "from")
, users as (
select date_trunc('day', "min_block_time") as "day"
, count(*) as users
from first_tx
group by date_trunc('day', "min_block_time"))
select sum(users) over (order by "day") as "cum_users"
, "day"
from users
Average Transaction Cost-Again the same table as transactions, and similar to transaction count and unique users, but this time calculating the ETH gas amount with the formula (gas_limit*gas_price)/10¹⁸. The author has also included the average cost for a Uniswap transaction, which is the same query with an added WHERE “to” = the Uniswap contract address.
select
avg(("gas_limit"*"gas_price")/1e18) as eth_gas_price
, date_trunc('day', "block_time") as "day"
from optimism."transactions"
group by date_trunc('day', "block_time")
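The Uniswap variant isn’t reproduced in the text, but it would be the same query with a WHERE filter added. A sketch, with a placeholder where the contract address belongs (look up the real one on https://optimistic.etherscan.io rather than trusting this):

```sql
-- Same daily average gas query, filtered to Uniswap transactions.
-- '\x...' is a placeholder: substitute the actual Uniswap contract
-- address on Optimism before running.
select
avg(("gas_limit"*"gas_price")/1e18) as uniswap_gas_price
, date_trunc('day', "block_time") as "day"
from optimism."transactions"
where "to" = '\x...' -- placeholder: Uniswap contract address
group by date_trunc('day', "block_time")
```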
I kind of wondered why the author did not include a USD conversion, so I gave it a shot. The prices.usd table is not available yet in the Optimism database, so I had to dig for an alternative. I did find a table called dune_user_generated.”uniswap_prices” which seemed promising, and in fact it worked. I built the subselect below and joined on day to calculate the USD price of the gas.
select max(price) as price
, "day"
from dune_user_generated."uniswap_prices"
where "symbol" = 'ETH'
group by "day"
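The combined query isn’t reproduced in the text, so here is a sketch of how the join on “day” could look, assuming the tables and columns described above:

```sql
-- Sketch: daily average gas in ETH joined to daily ETH price
-- to get an approximate USD gas cost.
with gas as (
select avg(("gas_limit"*"gas_price")/1e18) as eth_gas
, date_trunc('day', "block_time") as "day"
from optimism."transactions"
group by date_trunc('day', "block_time"))
, eth_price as (
select max(price) as price
, "day"
from dune_user_generated."uniswap_prices"
where "symbol" = 'ETH'
group by "day")
select g."day"
, g.eth_gas
, g.eth_gas * p.price as usd_gas
from gas g
join eth_price p on g."day" = p."day"
```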
ETH on Optimism-Now this gets a little more interesting. In this query, we are using a table called optimism.logs. The author uses the “block_time”, “data”, topic, and “contract_address” fields.
The data field represents the value of the transaction, and is divided by 10¹⁸ to get the ETH value. The data field is also converted with the “bytea2numeric” function, which is necessary to perform the division since you can’t divide a “bytea” field.
The topic fields are where the author differentiates an ETH burn from an ETH mint. My understanding of how L2s work is that when you bridge your assets, they get locked on the L1 chain and minted on the L2 chain, and when you exit, the opposite occurs. When “topic3” = the burn (zero) address, the ETH is being burned on L2. When “topic2” = the burn (zero) address, the ETH is being minted on L2. The “data” field is made (-) negative for burns and kept (+) positive for mints.
The contract address in the where clause represents the ETH contract address on Optimism. To confirm this I used the Optimism version of Etherscan (fun!) https://optimistic.etherscan.io.
When you add the mint and burn transactions together you get the resulting ETH left on Optimism.
with mints as (
select bytea2numeric("data")/1e18 as "eth"
, date_trunc('day', "block_time") as "day"
from optimism."logs"
where "contract_address" = '\x4200000000000000000000000000000000000006' -- ETH
and "topic2" = '\x0000000000000000000000000000000000000000000000000000000000000000'
)
, burns as (
select -1 * bytea2numeric("data")/1e18 as "eth"
, date_trunc('day', "block_time") as "day"
from optimism."logs"
where "contract_address" = '\x4200000000000000000000000000000000000006' -- ETH
and "topic3" = '\x0000000000000000000000000000000000000000000000000000000000000000'
)
-- combine mints and (negative) burns to get net ETH on Optimism per day
select sum("eth") as "eth"
, "day"
from (select * from mints union all select * from burns) t
group by "day"
Total Elements-This was a new concept for me. I was not sure what an “element” was.
From a brief study of the tables used, it appears that each “batch” transaction contains a variable number of “elements”. I’m not exactly sure if this is right, but I will assume that each element represents a single transaction; transactions are rolled into batches and submitted to the L1 chain as rollups.
The table used in these queries is unique in that it does not show up if you search for it in the explorer. It also has a freakishly long name.
OVM_CanonicalTransactionChain_evt_TransactionBatchAppended-This table has each individual batch, with its corresponding number of elements. There is a column titled _prevTotalElements which is a running total of the elements appended before each batch.
To show Total Elements you simply select the most recent row in the table, ordered by evt_block_time descending.
select "_prevTotalElements"
from optimism."OVM_CanonicalTransactionChain_evt_TransactionBatchAppended"
order by evt_block_time desc
limit 1
To show a cumulative sum of elements, you can aggregate the maximum value across each day, with no need to sum() over () because as I mentioned the column is already cumulative.
select max("_prevTotalElements") as "cum_elements"
, date_trunc('day', "evt_block_time") as "day"
from optimism."OVM_CanonicalTransactionChain_evt_TransactionBatchAppended"
group by date_trunc('day', "evt_block_time")
order by date_trunc('day', "evt_block_time") asc
Batch Size-This measurement also comes from the OVM_Canonical table, in the column “_batchSize”. This is not a cumulative column, and based on the data currently in this field the max value is 251. I believe this represents the total number of transactions (“elements”) that can be included in each batch, which is supposed to increase over time as Optimism moves beyond its beta phase.
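No query is shown for this chart in the text; here is a sketch of a daily maximum batch size query, assuming the “_batchSize” column described above:

```sql
-- Sketch: largest batch appended each day.
select max("_batchSize") as "max_batch_size"
, date_trunc('day', "evt_block_time") as "day"
from optimism."OVM_CanonicalTransactionChain_evt_TransactionBatchAppended"
group by date_trunc('day', "evt_block_time")
order by "day" asc
```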
Time Between Batches-Again we are using the Canonical table. This calculation uses a “new to me” function called “lag”, which lets you reference a value from the previous row dynamically (pretty cool). The author subtracts lag(evt_block_time) over (order by evt_block_time asc) from evt_block_time to get the time between each batch, then includes the “day” of the transactions and aggregates an average “batch_time” for each day.
with batches as (
select date_trunc('day', "evt_block_time") as "day"
, "evt_block_time" - lag("evt_block_time") over (order by "evt_block_time" asc) as "batch_time"
from optimism."OVM_CanonicalTransactionChain_evt_TransactionBatchAppended")
select "day"
, avg("batch_time") as "avg_batch_time"
from batches
group by "day"
order by "day" asc
That about sums it up. My philosophy has always been that you really don’t know until you try. Getting in there and actually doing the queries is where I gain my most valuable lessons.
Note: While using “Run by Selection” I kept getting the error below. Not sure if this is unique to the Optimism schema or even to the OVM table. When I copied the queries to a “New query” and ran the entire block they worked fine. Not sure why this was happening. Error: Relation “optimism.OVM_CanonicalTransactionChain_evt_TransactionBatchAppended” does not exist at line 3, position