This article is a summary of the questions and answers from AMA #4 on Reddit, lightly edited and annotated with additional context for easier understanding. We’ll continue doing this for future AMAs. AMA source here. It starts with the easier questions and gets more technical toward the end. All questions were answered by the AVA team. (/u/el33th4xor, /u/sekniqi, /u/StephenTechSupport, /u/Tederminant, /u/ccusce, /u/avalabsdan, /u/avawings)
Before you continue, join the AVA subreddit: https://www.reddit.com/r/ava/
#1 Will AVA have a single currency behind it that will go on exchanges? Or is it just a tech stack?
The root currency for the AVA platform is…well… AVA :-) The thing is, AVA is also a platform for new asset classes, and many of those we foresee being listed on exchanges themselves.
#2 How do AVA nodes get updated? Is there an auto-updater, or must it be done manually?
Right now AVA nodes must be updated manually (a `git pull` followed by a rebuild). We’re working on improving the build process.
#3 Athereum is running, right? How easy is it to get an Ethereum dapp running on Athereum?
We have not launched Athereum yet, but we have launched the C-Chain (Contract Chain). The C-Chain is a new, (originally) empty instance of the EVM. Athereum will have all of the state of Ethereum up to a certain, not-yet-determined block height.
Your existing tooling works on the C-Chain just as it does with Ethereum. See this tutorial on launching a smart contract with Remix and MetaMask.
#4 How does AVA intend to control inflation and deflation?
$AVA has a capped supply of 720,000,000 (720M) tokens. The genesis block will have 360M $AVA tokens; the remaining 360M tokens will be minted according to an emissions equation. For a graphical representation, Figure 2 shows the emissions curves of $AVA and BTC. The principle behind the emissions function chosen for $AVA is simple: reach a capped supply, in a fashion similar to Bitcoin’s emissions curve, while maintaining the ability to govern the rate at which the system reaches this limit.
Check out the token paper for the equation and minting process here: https://files.avalabs.org/papers/ava-token.pdf
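The actual minting equation is defined in the token paper linked above. As a purely illustrative sketch of the principle described here (not the real formula), a capped-supply schedule with a governable emission rate could look like this; the `rate` parameter is a hypothetical stand-in for the governance-controlled speed:

```python
import math

# Hypothetical sketch of a capped-supply emission schedule.
# The real $AVA minting equation lives in the token paper; this only
# illustrates the stated principle: approach a hard cap of 720M tokens
# from a 360M genesis supply, at a rate that governance can adjust.

CAP = 720_000_000      # hard cap on total $AVA supply
GENESIS = 360_000_000  # tokens present in the genesis block

def supply_after(years: float, rate: float = 0.1) -> float:
    """Total supply after `years`, minting the remaining tokens with
    exponential decay toward the cap. `rate` stands in for the
    governable emission-speed parameter."""
    remaining = CAP - GENESIS
    return GENESIS + remaining * (1.0 - math.exp(-rate * years))

# The supply starts at genesis and grows toward, but never exceeds, the cap:
print(supply_after(0))   # 360000000.0
print(supply_after(50) < CAP)  # True
```

Raising `rate` makes the system approach the 720M cap faster, which captures the “governable rate” property without changing the cap itself.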
#5 Can you say ahead of time what new components, tools, and libraries may become available with the emergence of Denali?
Additional context on what Denali is: The Denali test network will be the last round of testing before launching the AVA mainnet. It will feature many new components, tools, and libraries for developers.
Our website has a roadmap (https://www.avalabs.org/roadmap), and at the bottom there is a list of features we’re trying to get in. At the moment, NFTs on the X-Chain, transaction fees, and some DDoS protections are under discussion, but no promises yet on what’s in or out.
#6 Any chance NFTs will be ready for testing before mainnet?
Yup! They’re technically in the client now, but I’m still working on upgrading Slopes to align better with Gecko’s transaction parser and on testing the new transaction type. But it is there, if you check the transaction format docs!
Additional context on what NFTs are: A non-fungible token (NFT) digitizes a unique asset, and each such token is itself unique. This is what makes an NFT desirable, as opposed to ETH or BTC, where every unit is interchangeable with every other.
#7 Have you started doing outreach for AVAX? Can you tell us more about the upcoming bounties? Anything interesting for more high-level devs/contributors?
We have some very exciting announcements in the pipeline, but for now we’d like to keep them under wraps until we’re absolutely ready to release. I will say, we have hundreds of bounty hunters, and we are working with AVA-X project proposers every day. I think you’ll like what we have to share (when we’re ready to share it).
#8 In a recent AMA, /u/el33th4xor mentioned that AVA ‘can save archival requirements for the next 50 years’. I suspect you must have a novel approach given AVA can run thousands of transactions per second. Could you elaborate on this?
My guess is that Gün was talking about pruning. Most nodes don’t want to, and/or aren’t able to, store the full blockchain. Storing the full chain should be left to specialized archival tools that can handle terabytes to petabytes of data.