Cryptocurrency pegged to electricity price
Meter.io aims to create a low-volatility currency tracking the price of 10 kWh of electricity. Meter uses a hybrid PoW/PoS design: PoW mining for stablecoin creation and PoS for transaction ordering.
MTR is a stablecoin soft-pegged to the globally competitive price of 10 kWh of electricity.
MTRG is the finite-supply governance token, used by PoS validators to validate transactions.
PoW mining in Meter is as open and decentralized as in Bitcoin, but differs from Bitcoin's in two fundamental ways:
Block rewards are dynamic. Each reward is determined as a function of PoW difficulty: the winning Meter miner earns more MTR when the hash rate is high and less when it is low, keeping the production cost of each MTR stable at the price of 10 kWh of electricity on mainstream mining equipment.
Miners don't validate transactions. They simply compete to solve PoW. Transaction ordering is done by PoS validators, who secure the network and earn transaction fees in return.
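The dynamic-reward idea above can be sketched in a few lines. This is a hypothetical illustration, not Meter's actual emission formula: the function name, constants and the efficiency parameter are all assumptions, chosen only to show how scaling the reward with hashrate pins the production cost of one MTR to roughly 10 kWh of electricity.

```python
# Hypothetical sketch of a hashrate-indexed block reward. All names and
# constants here are illustrative assumptions, not Meter's real parameters.
JOULES_PER_KWH = 3.6e6
TARGET_KWH_PER_MTR = 10.0   # the peg: one MTR should cost ~10 kWh to mine

def block_reward_mtr(hashrate_hps: float,
                     block_interval_s: float,
                     efficiency_j_per_hash: float) -> float:
    """Reward scales with total hashrate so each minted MTR always
    represents ~10 kWh of electricity on the assumed hardware."""
    energy_kwh = hashrate_hps * block_interval_s * efficiency_j_per_hash / JOULES_PER_KWH
    return energy_kwh / TARGET_KWH_PER_MTR
```

If the network hashrate doubles, the reward doubles with it, so the electricity burned per newly minted MTR stays constant, which is exactly what keeps the production cost stable.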
All stablecoins essentially need stability mechanisms to account for cases where demand is high and where demand is low. MTR has two stability mechanisms set to solve this mission.

Supply side stability mechanism (long term)

First and foremost, MTR can't be produced out of thin air. Its issuance follows a disciplined monetary policy that depends solely on the profit-seeking behavior of miners. The only way to issue MTR is via PoW mining. When miners notice that the price of MTR is getting higher than the cost to produce it (remember, the cost of production is always fixed at the 10 kWh electricity price, around 0.9-1.2 USD), they will turn on their equipment and start creating new supply. If demand keeps increasing, more miners will join and more MTR will be printed to keep up with demand. Eventually supply will catch up with demand and the price will return to equilibrium.

When demand is low and the MTR price drops below the 10 kWh electricity price, miners will not risk shrinking their profit margins and will switch to mining other coins instead of MTR. As a result, MTR production stops and no additional MTR enters circulation. Given that mining is a competitive, open environment, the price of MTR will eventually equal the cost to produce it (Marginal Revenue = Marginal Cost). This long-term stability is achieved through a unique and simple mechanism at layer 1 that doesn't require capital-inefficient collateral, complicated oracles, seigniorage shares or algorithmic rebasing mechanisms.

Relative to nation-based fiat currencies, the switching cost between cryptocurrencies is significantly lower. Sudden demand changes in crypto are therefore very common and must be addressed. A huge drop in demand may temporarily cause MTR to trade below its cost of production, making PoW mining a losing game. How can the system recover from that and restart production? On the contrary, a sudden increase in demand may cause MTR to trade at a premium, making mining temporarily very profitable.
Meter has a second-layer stability mechanism to absorb sudden demand changes.

Demand side stability mechanism (short term)

An on-chain auction (going live in October 2020) resets every 24 hours, offering a newly minted, fixed number of MTRG in exchange for bids in MTR. Participants bid at no specific price and at the end of the auction receive MTRG proportional to their percentage of the total bid. The main purpose of this auction is to consume MTR. A portion of the MTR bid in the auction (initially 60%) ends up in a reserve collectively owned by MTRG holders, essentially going out of circulation; future use of the MTR in the reserve can be decided by governance. The remaining 40% is gradually distributed to PoS validators as block rewards. This reserve allocation ratio can be adjusted via governance depending on the amount of MTR that needs to be removed from circulation at any point in time.

The Meter team is also working to make Meter compatible with other blockchains. In fact, both MTR and MTRG can currently be 1:1 bridged to their Ethereum versions as eMTR and eMTRG respectively. In the near term, the stablecoin MTR is set out on a mission to serve as collateral and a crypto-native unit of account for DeFi.
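The auction settlement described above is easy to express as a sketch. The function and parameter names are assumptions for illustration; only the pro-rata payout rule and the initial 60/40 reserve split come from the text.

```python
# Illustrative settlement of the 24-hour MTRG auction. Function and
# parameter names are assumptions; the pro-rata payout and the initial
# 60/40 reserve split are from the description above.
def settle_auction(bids_mtr: dict, mtrg_offered: float, reserve_ratio: float = 0.60):
    total = sum(bids_mtr.values())
    # Each bidder receives MTRG proportional to their share of all MTR bid.
    payouts = {addr: mtrg_offered * amt / total for addr, amt in bids_mtr.items()}
    to_reserve = total * reserve_ratio       # leaves circulation (MTRG-holder reserve)
    to_validators = total - to_reserve       # gradually paid out as PoS block rewards
    return payouts, to_reserve, to_validators

payouts, reserve, validators = settle_auction({"alice": 30.0, "bob": 70.0}, 100.0)
print(payouts)               # {'alice': 30.0, 'bob': 70.0}
print(reserve, validators)   # 60.0 40.0
```

Note that bidders pay no fixed price: the effective MTRG price emerges from how much MTR the crowd bids in total.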
Hey all, I've been researching coins since 2017 and have gone through 100s of them in the last 3 years. I got introduced to blockchain via Bitcoin of course, analyzed Ethereum thereafter, and from that moment I have had a keen interest in smart contract platforms. I'm passionate about Ethereum but I find Zilliqa to have a better risk-reward ratio, especially because Zilliqa has found an elegant balance between being secure, decentralized and scalable in my opinion.
Below I post my analysis of why, of all the coins I went through, I'm most bullish on Zilliqa (yes, I went through Tezos, EOS, NEO, VeChain, Harmony, Algorand, Cardano, etc.). Note that this is not investment advice and, although it's a thorough analysis, there is obviously some bias involved. Looking forward to what you all think!
Fun fact: the name Zilliqa is a play on 'silica' (silicon dioxide), standing for "silicon for the high-throughput consensus computer."
This post is divided into (i) Technology, (ii) Business & Partnerships, and (iii) Marketing & Community. I’ve tried to make the technology part readable for a broad audience. If you’ve ever tried understanding the inner workings of Bitcoin and Ethereum you should be able to grasp most parts. Otherwise, just skim through and once you are zoning out head to the next part.
Technology and some more:
The technology is one of the main reasons why I’m so bullish on Zilliqa. First thing you see on their website is: “Zilliqa is a high-performance, high-security blockchain platform for enterprises and next-generation applications.” These are some bold statements.
Before we deep dive into the technology let’s take a step back in time first as they have quite the history. The initial research paper from which Zilliqa originated dates back to August 2016: Elastico: A Secure Sharding Protocol For Open Blockchains where Loi Luu (Kyber Network) is one of the co-authors. Other ideas that led to the development of what Zilliqa has become today are: Bitcoin-NG, collective signing CoSi, ByzCoin and Omniledger.
The technical white paper was made public in August 2017, and since then they have achieved everything stated in it and also created their own open-source, intermediate-level smart contract language called Scilla (a functional programming language similar to OCaml).
Mainnet has been live since the end of January 2019 with daily transaction rates growing continuously. About a week ago mainnet reached 5 million transactions and 500,000+ addresses in total, along with 2400 nodes keeping the network decentralized and secure. Circulating supply is nearing 11 billion and currently only mining rewards are left. The maximum supply is 21 billion, with annual inflation currently at 7.13%, and it will only decrease with time.
Zilliqa realized early on that the usage of public cryptocurrencies and smart contracts was increasing but decentralized, secure, and scalable alternatives were lacking in the crypto space. They proposed to apply sharding to a public smart contract blockchain, where the transaction rate increases almost linearly with the number of nodes. More nodes = higher transaction throughput and increased decentralization. Sharding comes in many forms and Zilliqa uses network-, transaction- and computational sharding. Network sharding opens up the possibility of using transaction- and computational sharding on top. Zilliqa does not use state sharding for now. We'll come back to this later.
Before we continue dissecting how Zilliqa achieves this from a technological standpoint, it's good to keep in mind that making a blockchain decentralised, secure and scalable all at once is still one of the main hurdles to widespread usage of decentralised networks. In my opinion this needs to be solved first before blockchains can get to the point where they create and add large-scale value. So I invite you to read the next section to grasp the underlying fundamentals. Because, after all, these premises need to be true, otherwise there isn't a fundamental case to be bullish on Zilliqa, right?
Down the rabbit hole
How have they achieved this? Let’s define the basics first: key players on Zilliqa are the users and the miners. A user is anybody who uses the blockchain to transfer funds or run smart contracts. Miners are the (shard) nodes in the network who run the consensus protocol and get rewarded for their service in Zillings (ZIL). The mining network is divided into several smaller networks called shards, which is also referred to as ‘network sharding’. Miners subsequently are randomly assigned to a shard by another set of miners called DS (Directory Service) nodes. The regular shards process transactions and the outputs of these shards are eventually combined by the DS shard as they reach consensus on the final state. More on how these DS shards reach consensus (via pBFT) will be explained later on.
The Zilliqa network produces two types of blocks: DS blocks and Tx blocks. One DS Block consists of 100 Tx Blocks. And as previously mentioned, there are two types of nodes concerned with reaching consensus: shard nodes and DS nodes. Whether a miner becomes a shard node or a DS node is determined by the result of a PoW cycle (Ethash) at the beginning of the DS Block. All candidate mining nodes compete with each other, running the PoW (Proof-of-Work) cycle for 60 seconds, and the submissions achieving the highest difficulty are allowed onto the network. To put it in perspective: the average difficulty for one DS node is ~2 TH/s, equaling 2,000,000 MH/s, or 55 thousand+ GeForce GTX 1070 / 8 GB GPUs at 35.4 MH/s. Every DS Block, 10 new DS nodes are admitted. A shard node needs to provide around 8.53 GH/s currently (around 240 GTX 1070s). Dual mining ETH/ETC and ZIL is possible and can be done via mining software such as Phoenix and Claymore. There are pools, and if you have large amounts of hashing power (Ethash) available you could mine solo.
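As a sanity check on the figures above, here is the arithmetic spelled out, using the post's own 35.4 MH/s per GTX 1070 (real-world rates vary with clocks and drivers):

```python
import math

# Back-of-the-envelope check of the DS-node and shard-node hashrate figures.
GTX_1070_MHS = 35.4  # Ethash rate per GTX 1070, as quoted in the text

def gpus_needed(target_mhs: float) -> int:
    # Round up: a fractional GPU still means buying one more card.
    return math.ceil(target_mhs / GTX_1070_MHS)

print(gpus_needed(2_000_000))  # DS node, ~2 TH/s -> 56498 GPUs
print(gpus_needed(8_530))      # shard node, ~8.53 GH/s -> 241 GPUs
```

So "55 thousand+" GPUs for a DS node and "around 240" for a shard node both check out.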
The PoW cycle of 60 seconds is a peak performance and acts as an entry ticket to the network. The entry ticket is a sybil resistance mechanism: it makes it incredibly hard for adversaries to spawn lots of identities and manipulate the network with them. After every 100 Tx Blocks, which corresponds to roughly 1.5 hours, this PoW process repeats. In between, no PoW needs to be done, meaning Zilliqa's energy consumption to keep the network secure is low. For more detailed information on how mining works click here.

Okay, hats off to you. You have made it this far. Before we go any deeper down the rabbit hole we first must understand why Zilliqa goes through all of the above technicalities and understand a bit more what a blockchain is on a more fundamental level. Because the core of Zilliqa's consensus protocol relies on pBFT (practical Byzantine Fault Tolerance), we need to know more about state machines and their function. Navigate to Viewblock, a Zilliqa block explorer, and then come back to this article. We will use this site to navigate through a few concepts.
We have established that Zilliqa is a public and distributed blockchain. Meaning that everyone with an internet connection can send ZILs, trigger smart contracts, etc. and there is no central authority who fully controls the network. Zilliqa and other public and distributed blockchains (like Bitcoin and Ethereum) can also be defined as state machines.
Taking the liberty of paraphrasing examples and definitions given by Samuel Brooks’ medium article, he describes the definition of a blockchain (like Zilliqa) as: “A peer-to-peer, append-only datastore that uses consensus to synchronize cryptographically-secure data”.
Next, he states that: "blockchains are fundamentally systems for managing valid state transitions". For some more context, I recommend reading the whole medium article to get a better grasp of the definitions and understanding of state machines. Nevertheless, let's try to simplify and compile it into a single paragraph. Take traffic lights as an example: all their states (red, amber, and green) are predefined, all possible outcomes are known, and it doesn't matter if you encounter the traffic light today or tomorrow. It will still behave the same. Managing the states of a traffic light can be done by triggering a sensor on the road or pushing a button, resulting in one traffic light's state going from green to red (via amber) and another light's from red to green.
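The traffic-light example above can be written down as a tiny state machine, with every state and every valid transition predefined (a toy sketch to make the concept concrete, not related to any Zilliqa code):

```python
# Toy state machine for the traffic light: every state and every valid
# transition is predefined, so its behaviour is fully deterministic.
TRANSITIONS = {
    ("green", "button"): "amber",   # sensor/button switches green -> amber
    ("amber", "timer"):  "red",
    ("red",   "timer"):  "green",
}

def step(state: str, event: str) -> str:
    # Any (state, event) pair not in the table is an invalid transition:
    # the light simply stays in its current state.
    return TRANSITIONS.get((state, event), state)

state = "green"
for event in ("button", "timer", "timer"):
    state = step(state, event)
print(state)  # "green": one full amber-red-green cycle completed
```

A blockchain is the same idea at vastly larger scale: the valid transitions are defined by the protocol rules, and consensus decides which transition actually happened.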
With public blockchains like Zilliqa, this isn't so straightforward and simple. It started with block #1 almost 1.5 years ago, and every 45 seconds or so a new block linked to the previous block is added, resulting in a chain of blocks with transactions in them that everyone can verify from block #1 to the current #647,000+ block. The state is ever-changing and the states it can find itself in are infinite. And while a traffic light might work in tandem with various other traffic lights, that's rather insignificant compared to a public blockchain, because Zilliqa consists of 2400 nodes that need to work together to achieve consensus on what the latest valid state is, while some of these nodes may have latency or broadcast issues, drop offline, or be deliberately trying to attack the network.
Now go back to the Viewblock page, take a look at the number of transactions, addresses, block and DS height, and then hit refresh. Obviously, as expected, you see new incremented values on one or all parameters. And how did the Zilliqa blockchain manage to transition from a previous valid state to the latest valid state? By using pBFT to reach consensus on the latest valid state.
After having obtained the entry ticket, miners execute pBFT to reach consensus on the ever-changing state of the blockchain. pBFT requires a series of network communications between nodes; no GPU is involved (only CPU). As a result, the total energy consumed to keep the blockchain secure, decentralized and scalable is low.
pBFT stands for practical Byzantine Fault Tolerance and is an optimization on the Byzantine Fault Tolerant algorithm. To quote Blockonomi: “In the context of distributed systems, Byzantine Fault Tolerance is the ability of a distributed computer network to function as desired and correctly reach a sufficient consensus despite malicious components (nodes) of the system failing or propagating incorrect information to other peers.” Zilliqa is such a distributed computer network and depends on the honesty of the nodes (shard and DS) to reach consensus and to continuously update the state with the latest block. If pBFT is a new term for you I can highly recommend the Blockonomi article.
The idea of pBFT was introduced in 1999 - one of the authors even won a Turing award for it - and it is well researched and applied in various blockchains and distributed systems nowadays. If you want more advanced information than the Blockonomi link provides click here. And if you’re in between Blockonomi and the University of Singapore read the Zilliqa Design Story Part 2 dating from October 2017. Quoting from the Zilliqa tech whitepaper: “pBFT relies upon a correct leader (which is randomly selected) to begin each phase and proceed when the sufficient majority exists. In case the leader is byzantine it can stall the entire consensus protocol. To address this challenge, pBFT offers a view change protocol to replace the byzantine leader with another one.”
pBFT can tolerate up to (but not including) ⅓ of the nodes being dishonest (offline counts as Byzantine = dishonest) while the consensus protocol functions without stalling or hiccups. Once ⅓ or more of the nodes are dishonest, but no more than ⅔, the network stalls and a view change is triggered to elect a new DS leader. Only when more than ⅔ of the nodes are dishonest (67%+) do double-spend attacks become possible.
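These thresholds follow from the classic pBFT bound n ≥ 3f + 1: a committee of n nodes tolerates at most f Byzantine members and commits once more than two-thirds agree. A sketch for one 600-node Zilliqa shard (this is the textbook formula; the exact quorum rule Zilliqa uses may differ in detail):

```python
# Textbook pBFT arithmetic for a committee of n nodes (n >= 3f + 1).
# A sketch only: Zilliqa's exact quorum rule may differ slightly.
def max_faulty(n: int) -> int:
    # Largest f such that n >= 3f + 1.
    return (n - 1) // 3

def quorum(n: int) -> int:
    # Votes needed to commit: 2f + 1, i.e. more than two-thirds of the committee.
    return 2 * max_faulty(n) + 1

n = 600  # one Zilliqa shard
print(max_faulty(n))  # 199 dishonest nodes tolerated
print(quorum(n))      # 399 matching votes needed
```

So a 600-node shard keeps running with up to 199 dishonest or offline nodes, which is where the "less than ⅓" tolerance comes from.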
If the network stalls no transactions can be processed and one has to wait until a new honest leader has been elected. When the mainnet was just launched and in its early phases, view changes happened regularly. As of today the last stalling of the network - and view change being triggered - was at the end of October 2019.
Another benefit of using pBFT for consensus, besides low energy use, is the immediate finality it provides. Once your transaction is included in a block and the block is added to the chain, it's done. Lastly, take a look at this article where three types of finality are defined: probabilistic, absolute and economic finality. Zilliqa falls under absolute finality (just like Tendermint, for example). Although lengthy already, we skipped over some of the inner workings of Zilliqa's consensus: read the Zilliqa Design Story Part 3 and you will be close to having a complete picture of it. Enough about PoW, the sybil resistance mechanism, pBFT, etc. Another thing we haven't looked at yet is the degree of decentralization.
Currently, there are four shards, each consisting of 600 nodes: 1 shard with 600 so-called DS nodes (Directory Service; they need to achieve a higher difficulty than shard nodes) and 1800 shard nodes, of which 250 are shard guards (centralized nodes controlled by the team). The number of shard guards has been steadily declining, from 1200 in January 2019 to 250 as of May 2020. On the Viewblock statistics you can see that many of the nodes are located in the US, but those are only the (CPU parts of the) shard nodes that perform pBFT; there is no data on where the PoW sources come from. When the Zilliqa blockchain starts reaching its transaction capacity limit, a network upgrade needs to be executed to lift the current cap of 2400 nodes, allowing more nodes and the formation of more shards, which will let the network keep scaling with demand.

Besides shard nodes there are also seed nodes. The main role of seed nodes is to serve as direct access points (for end-users and clients) to the core Zilliqa network that validates transactions. Seed nodes consolidate transaction requests and forward them to the lookup nodes (another type of node) for distribution to the shards in the network. Seed nodes also maintain the entire transaction history and the global state of the blockchain, which is needed to provide services such as block explorers. Seed nodes in the Zilliqa network are comparable to Infura on Ethereum.
The seed nodes were at first only operated by Zilliqa themselves, exchanges and Viewblock. Operators of seed nodes, like exchanges, had no incentive to open them to the greater public, so they were centralised at first. Decentralisation at the seed node level has been steadily rolled out since March 2020 (Zilliqa Improvement Proposal 3). Currently the number of seed nodes is being increased; they are public-facing, and at the same time PoS is applied to incentivize seed node operators and make it possible for ZIL holders to stake and earn passive yields. Important distinction: seed nodes are not involved in consensus! That is still PoW as the entry ticket and pBFT for the actual consensus.
5% of the block rewards have been assigned to seed nodes since the launch in 2019, and those are used to pay out ZIL stakers. The 5% block rewards with an annual yield of 10.03% translate to roughly 610 MM ZIL in total that can be staked. Exchanges use the custodial variant of staking, and wallets like Moonlet will use the non-custodial version (starting in Q3 2020). Staking is done by sending ZILs to a smart contract created by Zilliqa and audited by Quantstamp.
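A quick back-of-the-envelope check of those staking numbers (purely arithmetic on the figures quoted above, not actual on-chain emission data):

```python
# Consistency check of the staking figures quoted in the text.
annual_yield = 0.1003          # 10.03% APY for ZIL stakers
stakeable_zil = 610_000_000    # ~610 MM ZIL total stakeable

zil_paid_per_year = stakeable_zil * annual_yield
print(round(zil_paid_per_year / 1e6, 1))  # ~61.2 MM ZIL/year paid to stakers
```

In other words, the 5% seed-node allocation has to cover roughly 61 MM ZIL per year in staking rewards at that yield.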
With a high number of DS and shard nodes, and seed nodes becoming more decentralized too, Zilliqa qualifies for the label of decentralized in my opinion.
Generalized: programming languages can be divided into being 'object-oriented' or 'functional'. Here is an ELI5 given by a software development academy: "all programs have two basic components, data – what the program knows – and behavior – what the program can do with that data. So object-oriented programming states that combining data and related behaviors in one place, is called "object", which makes it easier to understand how a particular program works. On the other hand, functional programming argues that data and behavior are different things and should be separated to ensure their clarity."
Scilla is on the functional side and shares similarities with OCaml: OCaml is a general-purpose programming language with an emphasis on expressiveness and safety. It has an advanced type system that helps catch your mistakes without getting in your way. It's used in environments where a single mistake can cost millions and speed matters, is supported by an active community, and has a rich set of libraries and development tools. For all its power, OCaml is also pretty simple, which is one reason it's often used as a teaching language.
Scilla is blockchain agnostic, can be implemented on other blockchains as well, is recognized by academics, and won a Distinguished Artifact Award at the end of last year.
One of the reasons the Zilliqa team decided to create their own programming language focused on preventing smart contract vulnerabilities is that adding logic to a blockchain means you cannot afford to make mistakes; otherwise, it could cost you. It's all great and fun that blockchains are immutable, but updating your code because you found a bug isn't the same as with a regular web application, for example. And smart contracts inherently involve cryptocurrencies in some form, thus value.
Another difference with programming languages on a blockchain is gas. Every transaction you do on a smart contract platform like Zilliqa or Ethereum costs gas. With gas you basically pay for computational costs. Sending a ZIL from address A to address B costs 0.001 ZIL currently. Smart contracts are more complex, often involve various functions and require more gas (if gas is a new concept click here ).
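The gas model reduces to one multiplication: fee = gas consumed × gas price. The split of the quoted 0.001 ZIL transfer fee into units and unit price below is an illustrative assumption, not Zilliqa's actual fee schedule:

```python
# Toy gas model: fee = gas consumed * gas price. The 50-units / 0.00002-ZIL
# split of the quoted 0.001 ZIL transfer fee is an assumption for
# illustration, not Zilliqa's actual fee schedule.
def tx_fee_zil(gas_used: int, gas_price_zil: float) -> float:
    return gas_used * gas_price_zil

# e.g. a plain ZIL transfer: 50 gas units at 0.00002 ZIL per unit
print(round(tx_fee_zil(50, 0.00002), 6))  # 0.001
```

A contract call that touches more state simply consumes more gas units, which is why complex smart contracts cost more than a plain transfer.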
So with Scilla, similar to Solidity, you need to make sure that "every function in your smart contract will run as expected without hitting gas limits. An improper resource analysis may lead to situations where funds may get stuck simply because a part of the smart contract code cannot be executed due to gas limits. Such constraints are not present in traditional software systems" (Scilla Design Story Part 1).
Some examples of smart contract issues you’d want to avoid are: leaking funds, ‘unexpected changes to critical state variables’ (example: someone other than you setting his or her address as the owner of the smart contract after creation) or simply killing a contract.
Scilla also allows for formal verification. Wikipedia to the rescue: In the context of hardware and software systems, formal verification is the act of proving or disproving the correctness of intended algorithms underlying a system with respect to a certain formal specification or property, using formal methods of mathematics.
Formal verification can be helpful in proving the correctness of systems such as: cryptographic protocols, combinational circuits, digital circuits with internal memory, and software expressed as source code.
“Scilla is being developed hand-in-hand with formalization of its semantics and its embedding into the Coq proof assistant — a state-of-the art tool for mechanized proofs about properties of programs.”
Simply put, with Scilla and the accompanying tooling, developers can mathematically prove that the smart contract they've written does what they intend it to do.
Smart contracts in a sharded environment and state sharding
There is one more topic I’d like to touch on: smart contract execution in a sharded environment (and what is the effect of state sharding). This is a complex topic. I’m not able to explain it any easier than what is posted here. But I will try to compress the post into something easy to digest.
Earlier on we established that Zilliqa can process transactions in parallel due to network sharding. This is where the linear scalability comes from. We can define transaction types: a transaction from address A to B (Category 1), a transaction where a user interacts with one smart contract (Category 2), and the most complex ones, where triggering a transaction results in multiple smart contracts being involved (Category 3). The shards are able to process transactions on their own without interference from the other shards. With Category 1 transactions that is doable; with Category 2 transactions it is sometimes doable, if the address is in the same shard as the smart contract; but with Category 3 you definitely need communication between the shards. Solving that requires defining a set of communication rules the protocol needs to follow in order to process all transactions in a generalised fashion.
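The three categories, and the question of when shards must talk to each other, can be sketched like this (the address-to-shard hashing rule here is an illustrative assumption, not Zilliqa's actual assignment scheme):

```python
import hashlib

# Sketch of transaction categories in a sharded network. The address ->
# shard mapping below is an illustrative assumption, not Zilliqa's
# actual assignment rule.
NUM_SHARDS = 4

def shard_of(address: str) -> int:
    # Deterministic mapping from an address to one of the shards.
    return int(hashlib.sha256(address.encode()).hexdigest(), 16) % NUM_SHARDS

def category(contracts: list) -> int:
    # 1: plain A -> B transfer, 2: one contract touched, 3: several contracts.
    return 1 if not contracts else (2 if len(contracts) == 1 else 3)

def needs_cross_shard(sender: str, contracts: list) -> bool:
    # Category 2 stays in-shard only if sender and contract share a shard;
    # Category 3 generally forces shard-to-shard communication.
    shards = {shard_of(a) for a in [sender, *contracts]}
    return len(shards) > 1

print(category([]))                 # 1
print(category(["dex", "oracle"]))  # 3
```

The hard protocol-design work lies in the `needs_cross_shard` cases: defining generalised rules for how shards exchange intermediate results.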
There is no strict defined roadmap but here are topics being worked on. And via the Zilliqa website there is also more information on the projects they are working on.
Business & Partnerships
It's not only technology in which Zilliqa seems to be excelling, as their ecosystem has been expanding and starting to grow rapidly. The project is on a mission to provide OpenFinance (OpFi) to the world, and Singapore is the right place to be due to its progressive regulations and futuristic thinking. Singapore has taken a proactive approach towards cryptocurrencies by introducing the Payment Services Act 2019 (PS Act). Among other things, the PS Act will regulate intermediaries dealing with certain cryptocurrencies, with a particular focus on consumer protection and anti-money laundering. It will also provide a stable regulatory licensing and operating framework for cryptocurrency entities, effectively covering all crypto businesses and exchanges based in Singapore. According to PWC, 82% of the surveyed executives in Singapore reported blockchain initiatives underway and 13% of them have already brought the initiatives live to the market. There is also an increasing list of organizations that are starting to provide digital payment services. Moreover, Singaporean blockchain developer Building Cities Beyond has recently created a $15 million innovation grant to encourage development on its ecosystem. This all suggests that Singapore is trying to position itself as (one of) the leading blockchain hubs in the world.
Zilliqa seems to already take advantage of this and recently helped launch Hg Exchange on their platform, together with financial institutions PhillipCapital, PrimePartners and Fundnel. Hg Exchange, which is now approved by the Monetary Authority of Singapore (MAS), uses smart contracts to represent digital assets. Through Hg Exchange financial institutions worldwide can use Zilliqa's safe-by-design smart contracts to enable the trading of private equities. For example, think of companies such as Grab, Airbnb, SpaceX that are not available for public trading right now. Hg Exchange will allow investors to buy shares of private companies & unicorns and capture their value before an IPO. Anquan, the main company behind Zilliqa, has also recently announced that they became a partner and shareholder in TEN31 Bank, which is a fully regulated bank allowing for tokenization of assets and is aiming to bridge the gap between conventional banking and the blockchain world. If STOs, the tokenization of assets, and equity trading will continue to increase, then Zilliqa’s public blockchain would be the ideal candidate due to its strategic positioning, partnerships, regulatory compliance and the technology that is being built on top of it.
What is also very encouraging is their focus on banking the un(der)banked. They are launching a stablecoin basket starting with XSGD. As many of you know, stablecoins are currently mostly used for trading. However, Zilliqa is actively trying to broaden the use case of stablecoins. I recommend everybody to read this text that Amrit Kumar wrote (one of the co-founders). These stablecoins will be integrated in the traditional markets and bridge the gap between the crypto world and the traditional world. This could potentially revolutionize and legitimise the crypto space if retailers and companies will for example start to use stablecoins for payments or remittances, instead of it solely being used for trading.
Zilliqa also released their DeFi strategic roadmap (dating from November 2019), which seems to align well with their OpFi strategy. A non-custodial DEX is coming to Zilliqa, made by Switcheo, which allows cross-chain trading (atomic swaps) between ETH, EOS and ZIL based tokens. They also signed a Memorandum of Understanding for a (soon to be announced) USD stablecoin. And as Zilliqa is all about regulations and being compliant, I'm speculating on it to be a regulated USD stablecoin. Furthermore, XSGD is already created and visible on the block explorer, and XIDR (Indonesian stablecoin) is also coming soon via StraitsX. Here is also an overview of the Tech Stack for Financial Applications from September 2019. Further quoting Amrit Kumar on this:
“There are two basic building blocks in DeFi/OpFi though: 1) stablecoins as you need a non-volatile currency to get access to this market and 2) a dex to be able to trade all these financial assets. The rest are built on top of these blocks.
So far, together with our partners and community, we have worked on developing these building blocks with XSGD as a stablecoin. We are working on bringing a USD-backed stablecoin as well. We will soon have a decentralised exchange developed by Switcheo. And with HGX going live, we are also venturing into the tokenization space. More to come in the future.”
Additionally, they also have the ZILHive initiative that injects capital into projects. There have already been 6 waves of various teams working on infrastructure, innovation and research, and they are not from ASEAN or Singapore only but global: see Grantees breakdown by country. Over 60 project teams from over 20 countries have contributed to Zilliqa's ecosystem. This includes individuals and teams developing wallets, explorers, developer toolkits, smart contract testing frameworks, dapps, etc. As some of you may know, Unstoppable Domains (UD) blew up when they launched on Zilliqa. UD aims to replace cryptocurrency addresses with a human-readable name and allows for uncensorable websites. Zilliqa will probably be the only one able to handle all these transactions on-chain due to its ability to scale and its resulting low fees, which is why the UD team launched on Zilliqa in the first place. Furthermore, Zilliqa also has a strong emphasis on security, compliance, and privacy, which is why they partnered with companies like Elliptic, ChainSecurity (part of PwC Switzerland), and Incognito. Their sister company Aqilliz (Zilliqa spelled backwards) focuses on revolutionizing the digital advertising space and is doing interesting things like using Zilliqa to track outdoor digital ads with companies like Foodpanda.
Zilliqa is listed on nearly all major exchanges, having several different fiat-gateways and recently have been added to Binance’s margin trading and futures trading with really good volume. They also have a very impressive team with good credentials and experience. They don't just have “tech people”. They have a mix of tech people, business people, marketeers, scientists, and more. Naturally, it's good to have a mix of people with different skill sets if you work in the crypto space.
Marketing & Community
Zilliqa has a very strong community. If you follow their Twitter, their engagement is much higher than you would expect for a coin with approximately 80k followers. They have also been named 'coin of the day' by LunarCrush many times. LunarCrush tracks real-time cryptocurrency value and social data, and according to that data Zilliqa seems to have a more fundamental and deeper understanding of marketing and community engagement than almost any other coin. While almost all coins have been a bit frozen in recent months, Zilliqa seems to be on its own bull run: it was somewhere in the 100s a few months ago and is currently ranked #46 on CoinGecko. Their official Telegram has over 20k people and is very active, and their community channel, now over 7k, is larger and more active than many other projects' official channels. Their local communities also seem to be growing.
Moreover, their community started 'Zillacracy' together with the Zilliqa core team (see www.zillacracy.com). It's a community-run initiative where people from all over the world help with marketing and development on Zilliqa. Since its launch in February 2020 they have been doing a lot, and they will also run their own non-custodial seed node for staking. This seed node will allow them to generate revenue and become a self-sustaining entity that could potentially scale up into a decentralized company working in parallel with the Zilliqa core team. Comparing this to the other smart contract platforms (e.g. Cardano, EOS, Tezos), none seem to have started a similar initiative (correct me if I'm wrong though). In my opinion this suggests that those platforms do not fully understand how to harness the 'power of the community'. This is something you cannot buy with money, and its absence puts many projects in the space at a disadvantage.
Zilliqa also released two social products called SocialPay and Zeeves. SocialPay allows users to earn ZIL while tweeting with a specific hashtag. After an initial pilot program, they recently used it in a marketing campaign in partnership with the Singapore Red Cross. It seems like a very valuable social product with a solid use case, and I can see a lot of traditional companies entering the space through it, which they seem to suggest will happen. Tokenizing hashtags with smart contracts to create network effects is a very smart and innovative idea.
Regarding Zeeves: this is a tipping bot for Telegram. It already has thousands of signups, and they plan to keep upgrading it so more and more people use it (e.g. they recently added a quiz feature). They also use it during AMAs to reward people in real time. It's a very smart approach to growing their communities and getting people familiar with ZIL, and I can see it becoming very big on Telegram. This tool suggests, again, that the Zilliqa team has a deep understanding of what the crypto space and community need and is good at finding the right innovative tools to grow and scale.
To be honest, I haven't covered everything (I'm also reaching the character limit, haha). So many updates are happening lately that it's hard to keep up: the International Monetary Fund mentioning Zilliqa in their report, custodial and non-custodial staking, Binance margin, futures, the widget, entering the Indian market, and more. Head of Marketing Colin Miles has also released an overview of what is coming next. And last but not least, Vitalik Buterin has been mentioning Zilliqa lately, acknowledging the project and noting that both projects have a lot of room to grow. There is much more info of course, and a good part of it has been served to you on a silver platter. I invite you to continue researching by yourself :-) And if you have any comments or questions, please post here!
AITD class lesson 4: Mining brings fortune; consensus generates value faster.
As we all know, Bitcoin is a decentralized digital currency with no central issuer. This raises a question: if nobody issues Bitcoin, how was the first Bitcoin created? The answer: through mining. In January 2009, Bitcoin's creator Satoshi Nakamoto mined the genesis block on the Bitcoin network and received 50 bitcoins as a reward, and so the first batch of bitcoins officially came into existence. Since then, more and more miners have joined in and earned large amounts of Bitcoin. Mining here is not like physical mining: it relies on the Bitcoin network's consensus algorithm, with mining machines continuously computing in search of a valid block hash. The machine that finds the correct answer first unlocks the block and collects the bitcoins in it as a reward. The whole process is a bit like a lottery: the more tickets you buy, the higher your chance of matching the winning number; the differences are that there are no second or third prizes, and the prize cannot be shared with anyone else. Having explained mining, let's look at the consensus algorithm mentioned earlier. The entire mining process is governed by the consensus algorithm, which we can think of as the "mining rules." Take Bitcoin as an example: the PoW algorithm used by the Bitcoin network can be thought of simply as a contribution-of-computing-power algorithm. Every node has the right to mine, i.e., to take part in the computation for the block puzzle; in the blockchain industry, a node's continuous computation is usually measured as its hashrate contribution, and the higher a node's hashrate contribution, the higher its chance of unlocking a block and earning the Bitcoin reward.
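The hash-guessing "lottery" described above can be sketched in a few lines of Python. This is a toy illustration, not real Bitcoin code: the header bytes, nonce encoding, and difficulty value below are made up, but the loop shows the core idea of grinding through nonces until the double-SHA256 hash falls below a target.

```python
import hashlib

def mine(block_header: bytes, difficulty_bits: int, max_nonce: int = 10_000_000):
    """Toy proof-of-work: find a nonce such that the double-SHA256 hash of
    (header + nonce) is below the target. Smaller target = harder puzzle."""
    target = 2 ** (256 - difficulty_bits)
    for nonce in range(max_nonce):
        data = block_header + nonce.to_bytes(8, "little")
        digest = hashlib.sha256(hashlib.sha256(data).digest()).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce, digest.hex()  # this "lottery ticket" won
    return None  # gave up before finding a winner

# With 16 difficulty bits, roughly one in 65,536 nonces wins on average.
print(mine(b"example block header", difficulty_bits=16))
```

A machine with more hashrate simply tries more nonces per second, which is exactly the "more tickets, better odds" dynamic the lesson describes.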
In Bitcoin's early days, mining difficulty was low; miners could unlock blocks easily and collect rewards. As Bitcoin's consensus grew stronger, more and more people joined in mining. More competitors means higher difficulty and more energy consumed by mining machines; at the same time, because the Bitcoin block reward is periodically cut in half, mining profits keep being squeezed, and the waste of computing resources caused by PoW has drawn public criticism. As blockchain technology developed, the industry began looking for lower-cost mining models, and algorithms such as PoS, DPoS, and PoC appeared one after another. These algorithms move away from Bitcoin's pure computing-power contest and introduce alternative mining methods such as "holding coins generates revenue" (PoS), "a small set of elected witnesses produce blocks" (DPoS), and "proof of disk capacity" (PoC), enabling digital currencies to fit a wider range of scenarios. (The detailed evolution will be explained in the next episode.) At present, consensus algorithms are evolving toward simpler procedures and lower resource usage, and only improved mechanisms will earn strong consensus. In the future, many more algorithms will appear in the market, and mining will become simpler, fairer, and more energy-efficient. Next episode preview: the evolution of consensus algorithm mechanisms.
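The "difficulty rises when more competitors join" dynamic is implemented in Bitcoin by retargeting every 2016 blocks so that blocks keep arriving roughly every ten minutes. A simplified sketch follows; the clamp to a factor of 4 matches Bitcoin's rule, but the plain floating-point arithmetic is a simplification of the integer target math real nodes use.

```python
TARGET_BLOCK_TIME_S = 600   # Bitcoin aims for one block every 10 minutes
RETARGET_INTERVAL = 2016    # difficulty adjusts every 2016 blocks

def retarget(old_difficulty: float, actual_timespan_s: float) -> float:
    """Scale difficulty by how much faster (or slower) than expected the
    last 2016 blocks arrived, clamped to a 4x change as in Bitcoin."""
    expected = TARGET_BLOCK_TIME_S * RETARGET_INTERVAL
    ratio = expected / actual_timespan_s
    ratio = max(0.25, min(4.0, ratio))  # never adjust by more than 4x
    return old_difficulty * ratio

# Blocks came in twice as fast as expected -> difficulty doubles.
print(retarget(1.0, TARGET_BLOCK_TIME_S * RETARGET_INTERVAL / 2))
```

When many miners join, blocks arrive too fast and difficulty rises; when miners leave, blocks slow down and difficulty falls, keeping issuance on schedule.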
https://github.com/gridcoin-community/Gridcoin-Research/releases/tag/22.214.171.124 Finally! After over ten months of development and testing, "Fern" has arrived! This is a whopper: 240 pull requests merged. Essentially a complete rewrite that started with the scraper (the "neural net" rewrite) in "Denise" has now been completed. Practically the ENTIRE Gridcoin-specific codebase resting on top of the vanilla Bitcoin/Peercoin/Blackcoin PoS code has been rewritten. This removes the team requirement at last (see below), although there are many other important improvements besides that.

Fern was a monumental undertaking. We had to encode all of the old rules active for the v10 block protocol in new code and ensure that the new code was 100% compatible. This had to be done in such a way as to clear out all of the old spaghetti and ring-fence it with tightly controlled class implementations. We then wrote an entirely new, simplified ruleset for research rewards and reengineered contracts (which include beacon management, polls, and voting) using properly classed code. The fundamentals of Gridcoin with this release are now on a very sound and maintainable footing, and the developers believe the codebase as updated here will serve as the fundamental basis for Gridcoin's future roadmap.

We have been testing this for MONTHS on testnet in various stages. The v10 (legacy) compatibility code has been running on testnet continuously as it was developed to ensure compatibility with existing nodes. During the last few months, we have done two private testnet forks and then full public testnet testing for the v11 code (the new protocol that Fern implements). The developers have also been running non-staking "sentinel" nodes on mainnet with this code to verify that the consensus rules are problem-free for the legacy compatibility code on the broader mainnet. We believe this amount of testing is going to result in a smooth rollout.
Given the amount of changes in Fern, I am presenting TWO changelogs below. One is high level, which summarizes the most significant changes in the protocol. The second changelog is the detailed one in the usual format, and gives you an inkling of the size of this release.
Note that the protocol changes will not become active until we cross the hard-fork transition height to v11, which has been set at 2053000. Given current average block spacing, this should happen around October 4, about one month from now. Note that to get all of the beacons in the network on the new protocol, we are requiring ALL beacons to be validated. A two week (14 day) grace period is provided by the code, starting at the time of the transition height, for people currently holding a beacon to validate the beacon and prevent it from expiring. That means that EVERY CRUNCHER must advertise and validate their beacon AFTER the v11 transition (around Oct 4th) and BEFORE October 18th (or more precisely, 14 days from the actual date of the v11 transition). If you do not advertise and validate your beacon by this time, your beacon will expire and you will stop earning research rewards until you advertise and validate a new beacon. This process has been made much easier by a brand new beacon "wizard" that helps manage beacon advertisements and renewals. Once a beacon has been validated and is a v11 protocol beacon, the normal 180 day expiration rules apply. Note, however, that the 180 day expiration on research rewards has been removed with the Fern update. This means that while your beacon might expire after 180 days, your earned research rewards will be retained and can be claimed by advertising a beacon with the same CPID and going through the validation process again. In other words, you do not lose any earned research rewards if you do not stake a block within 180 days and keep your beacon up-to-date. The transition height is also when the team requirement will be relaxed for the network.
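For concreteness, the beacon timeline described above can be sketched with simple date arithmetic. The October 4 transition date is the post's own estimate (the real date depends on actual block spacing), so the dates below are illustrative.

```python
from datetime import date, timedelta

# Estimated v11 transition date from the post; the real date depends on
# when block 2053000 is actually reached.
v11_transition = date(2020, 10, 4)
grace_period = timedelta(days=14)      # window to re-validate an existing beacon
beacon_lifetime = timedelta(days=180)  # normal expiration after validation

# Every cruncher must advertise and validate their beacon before this date:
validation_deadline = v11_transition + grace_period
print(validation_deadline)

# A beacon validated right at the transition then expires 180 days later:
print(v11_transition + beacon_lifetime)
```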
Besides the beacon wizard, there are a number of improvements to the GUI, including new UI transaction types (and icons) for staking the superblock, sidestake sends, beacon advertisement, voting, poll creation, and transactions with a message. The main screen has been revamped with a better summary section, and better status icons. Several changes under the hood have improved GUI performance. And finally, the diagnostics have been revamped.
The wallet sync speed has been DRASTICALLY improved. A decent machine with a good network connection should be able to sync the entire mainnet blockchain in less than 4 hours. A fast machine with a really fast network connection and a good SSD can do it in about 2.5 hours. One of our goals was to reduce or eliminate the reliance on snapshots for mainnet, and I think we have accomplished that goal with the new sync speed. We have also streamlined the in-memory structures for the blockchain which shaves some memory use. There are so many goodies here it is hard to summarize them all. I would like to thank all of the contributors to this release, but especially thank @cyrossignol, whose incredible contributions formed the backbone of this release. I would also like to pay special thanks to @barton2526, @caraka, and @Quezacoatl1, who tirelessly helped during the testing and polishing phase on testnet with testing and repeated builds for all architectures. The developers are proud to present this release to the community and we believe this represents the starting point for a true renaissance for Gridcoin!
Most significantly, nodes calculate research rewards directly from the magnitudes in EACH superblock between stakes instead of using a two- or three- point average based on a CPID's current magnitude and the magnitude for the CPID when it last staked. For those long-timers in the community, this has been referred to as "Superblock Windows," and was first done in proof-of-concept form by @denravonska.
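A rough sketch of the "Superblock Windows" idea in Python. The exact per-superblock accrual formula here is my own illustrative assumption, not Gridcoin's implementation; the 0.25 magnitude unit and 16384 GRC cap are taken from the changelog notes in this post.

```python
MAGNITUDE_UNIT = 0.25  # GRC per unit of magnitude per day (pinned value, per the notes)
MAX_REWARD = 16384     # GRC cap on a research reward claim (per the notes)

def accrued_reward(superblocks):
    """Hypothetical sketch: accrue rewards from the magnitude recorded in
    EACH superblock between stakes, instead of averaging the endpoint
    magnitudes. `superblocks` is a list of (magnitude, days_spanned) pairs."""
    total = sum(mag * MAGNITUDE_UNIT * days for mag, days in superblocks)
    return min(total, MAX_REWARD)

# Three superblock windows with varying magnitude between two stakes:
print(accrued_reward([(100, 1.0), (80, 1.0), (120, 0.5)]))
```

The point of summing over every superblock is that a CPID whose magnitude fluctuates between stakes is paid for what it actually contributed in each window, rather than an average of two or three snapshots.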
Network magnitude unit pinned to a static value of 0.25
Max research reward allowed per block raised to 16384 GRC (from 12750 GRC)
New CPIDs begin accruing research rewards from the first superblock that contains the CPID instead of from the time of the beacon advertisement
500 GRC research reward limit for a CPID's first stake
6-month expiration for unclaimed rewards
10-block spacing requirement between research reward claims
Rolling 5-day payment-per-day limit
Legacy tolerances for floating-point error and time drift
The need to include a valid copy of a CPID's magnitude in a claim
10-block emission adjustment interval for the magnitude unit
One-time beacon activation requires that participants temporarily change their usernames to a verification code at one whitelisted BOINC project
Verification codes of pending beacons expire after 3 days
Self-service beacon removal
Burn fee for beacon advertisement increased from 0.00001 GRC to 0.5 GRC
Rain addresses derived from beacon keys instead of a default wallet address
Beacon expiration determined as of the current block instead of the previous block
The ability for developers to remove beacons
The ability to sign research reward claims with non-current but unexpired beacons
As a reminder:
Beacons expire after 6 months pass (180 days)
Beacons can be renewed after 5 months pass (150 days)
Renewed beacons must be signed with the same key as the original beacon
Magnitudes less than 1 include two fractional places
Magnitudes greater than or equal to 1 but less than 10 include one fractional place
A valid superblock must match a scraper convergence
Superblock popularity election mechanics
Yes/no/abstain and single-choice response types (no user-facing support yet)
To create a poll, an address must claim UTXOs totaling at least 100000 GRC, using a maximum of 250 UTXOs selected from the largest downward.
Burn fee for creating polls scaled by the number of UTXOs claimed
50 GRC for a poll contract
0.001 GRC per claimed UTXO
Burn fee for casting votes scaled by the number of UTXOs claimed
0.01 GRC for a vote contract
0.01 GRC to claim magnitude
0.01 GRC per claimed address
0.001 GRC per claimed UTXO
Maximum length of a poll title: 80 characters
Maximum length of a poll question: 100 characters
Maximum length of a poll discussion website URL: 100 characters
Maximum number of poll choices: 20
Maximum length of a poll choice label: 100 characters
Magnitude, CPID count, and participant count poll weight types
The ability for developers to remove polls and votes
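Putting the poll and vote fee schedule above into a small Python sketch. The function shapes are my own illustration; the GRC figures come directly from the list above.

```python
def poll_fee(n_utxos: int) -> float:
    """Burn fee to create a poll: 50 GRC for the poll contract plus
    0.001 GRC per claimed UTXO."""
    return 50.0 + 0.001 * n_utxos

def vote_fee(n_utxos: int, n_addresses: int, claims_magnitude: bool) -> float:
    """Burn fee to cast a vote: 0.01 GRC for the vote contract,
    0.01 GRC per claimed address, 0.001 GRC per claimed UTXO,
    and 0.01 GRC extra to claim magnitude."""
    fee = 0.01 + 0.01 * n_addresses + 0.001 * n_utxos
    if claims_magnitude:
        fee += 0.01
    return fee

# A max-size poll claim (250 UTXOs) and a vote claiming 100 UTXOs,
# 2 addresses, and magnitude:
print(poll_fee(250))
print(vote_fee(100, 2, True))
```

Scaling the burn fee with the number of claimed UTXOs discourages padding claims with dust while keeping small, honest claims cheap.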
[126.96.36.199] 2020-09-03, mandatory, "Fern"
Backport newer uint256 types from Bitcoin #1570 (@cyrossignol)
Implement project level rain for rainbymagnitude #1580 (@jamescowens)
Upgrade utilities (update checker and snapshot downloader application) #1576 (@iFoggz)
Provide fees collected in the block by the miner #1601 (@iFoggz)
Add support for generating legacy superblocks from scraper stats #1603 (@cyrossignol)
Port of the Bitcoin Logger to Gridcoin #1600 (@jamescowens)
Implement zapwallettxes #1605 (@jamescowens)
Implements a global event filter to suppress help question mark #1609 (@jamescowens)
Add next target difficulty to RPC output #1615 (@cyrossignol)
Add caching for block hashes to CBlock #1624 (@cyrossignol)
Make toolbars and tray icon red for testnet #1637 (@jamescowens)
Add an rpc call convergencereport #1643 (@jamescowens)
Implement newline filter on config file read in #1645 (@jamescowens)
Implement beacon status icon/button #1646 (@jamescowens)
Add gridcointestnet.png #1649 (@caraka)
Add precision to support magnitudes less than 1 #1651 (@cyrossignol)
Replace research accrual calculations with superblock snapshots #1657 (@cyrossignol)
Publish example gridcoinresearch.conf as a md document to the doc directory #1662 (@jamescowens)
Add options checkbox to disable transaction notifications #1666 (@jamescowens)
Add support for self-service beacon deletion #1695 (@cyrossignol)
Add support for type-specific contract fee amounts #1698 (@cyrossignol)
Add verifiedbeaconreport and pendingbeaconreport #1696 (@jamescowens)
Add preliminary testing option for block v11 height on testnet #1706 (@cyrossignol)
Add verified beacons manifest part to superblock validator #1711 (@cyrossignol)
Implement beacon, vote, and superblock display categories/icons in UI transaction model #1717 (@jamescowens)
How EpiK Protocol “Saved the Miners” Eliminated by Filecoin with the E2P Storage Model
On October 20, Eric Yao, Head of EpiK China, and Leo, Co-Founder & CTO of EpiK, visited the Deep Chain Online Salon and discussed how EpiK saved the miners eliminated by Filecoin by launching the E2P storage model. The following is a transcript of the sharing session.

Sharing Session

Eric: Hello, everyone. I'm Eric, a graduate of the School of Information Science, Tsinghua University. My Master's research was on data storage and big-data computing, and I published a number of papers at top industry conferences. Since 2013, I have invested in Bitcoin, Ethereum, Ripple, Dogecoin, EOS and other well-known blockchain projects, and I have settled into the space as an early technology-focused investor and industry observer with 2 years of blockchain experience. I am also a blockchain community initiator and technology evangelist.

Leo: Hi, I'm Leo, CTO of EpiK. Before co-founding EpiK, I spent 3 to 4 years working on blockchain: public chains, wallets, browsers, decentralized exchanges, task distribution platforms, smart contracts, etc., and I've built some great products. EpiK is an answer to a question we've been asking for years about how blockchain should land in the real world, and we hope EpiK is fortunate enough to be an answer for you as well.

Q & A

Deep Chain Finance: First of all, let me ask Eric: on October 15, Filecoin's mainnet launched, which attracted everyone's attention, but at the same time the calls for a fork of Filecoin never stopped, and the EpiK Protocol is one of them. What kind of project is the EpiK Protocol? Why did you choose to fork in the first place? And what are the differences between the forked project and Filecoin itself?

Eric: First of all, let me answer the first question: what kind of project is the EpiK Protocol?
With the Fourth Industrial Revolution already upon us, comprehensive intelligence is one of the core goals of this stage, and the key to comprehensive intelligence is making machines understand what humans know and learn new knowledge based on what they already know. Building knowledge graphs at scale is a key step toward full intelligence. The EpiK Protocol was born to solve the many challenges of building large-scale knowledge graphs. EpiK Protocol is a decentralized, hyper-scale knowledge graph that organizes and incentivizes knowledge through decentralized storage technology, decentralized autonomous organizations, and a generalized economic model. Members of the global community will expand the horizons of artificial intelligence toward a smarter future by organizing all areas of human knowledge into a knowledge map that is shared and continuously updated as an eternal knowledge vault for humanity.

Then, why was a fork chosen in the first place? EpiK's founders are all senior blockchain industry practitioners who have closely followed industry developments and application scenarios, among which decentralized storage is a very fresh one. However, during Filecoin's development, the team found that due to certain design mechanisms and historical reasons, Filecoin had deviated from the project's original intention: for example, the overly harsh penalty mechanism poses a threat that weakens security, and the computing-power race has produced a monopoly by large miners, who monopolize packaging rights and can inflate their computing power by uploading useless data themselves.
These problems will make the data environment on Filecoin worse and worse, leading to on-chain data that lacks real value, high data redundancy, and difficulty commercializing the project. Having observed these problems, the EpiK team proposes to introduce multiple roles and a decentralized collaboration platform (a DAO) to ensure the high value of on-chain data through a reasonable economic model and incentive mechanism, and to store high-value data, the knowledge graph, on the blockchain via decentralized storage. This largely solves both the lack of value of on-chain data and the monopoly of large miners' computing power.

Finally, what are the differences between the forked project and Filecoin itself? Building on the issues above, EpiK's design differs substantially from Filecoin's. First, EpiK is more focused in its business model: it targets a different market and track than the cloud storage market Filecoin is in, because decentralized storage has no advantage over professional centralized cloud storage in storage cost or user experience. EpiK focuses on building a decentralized knowledge graph, which reduces data redundancy and safeguards the value of the data stored on-chain while preventing the knowledge graph from being tampered with by a few people, thus making the commercialization of the entire project reasonable and feasible. From the perspective of ecosystem building, EpiK treats miners more fairly and solves Filecoin's pain points to a large extent. First, it replaces Filecoin's storage collateral and commitment collateral with a one-time collateral: miners participating in the EpiK Protocol are only required to pledge 1000 EPK per miner, once, before mining begins, not per sector.
What do 1000 EPK amount to? You only need to participate in pre-mining for about 50 days to earn the tokens used for the pledge. The EPK pre-mining campaign is currently underway, running from early September to December, with a daily release of 50,000 ERC-20 standard EPK; pre-mining nodes whose applications are approved divide these tokens according to the day's mining ratio, and the tokens can be exchanged 1:1 for mainnet tokens once the mainnet launches. This will continue to expand the number of miners eligible to participate in EPK mining. Secondly, EpiK has a more lenient penalty mechanism than Filecoin's official consensus, storage, and contract penalties. Because data can only be uploaded by field experts (the "Expert to Person" model) and every miner's data is backed up, one or more miners going offline has little impact on the network. A miner who fails to submit a proof of spacetime on time because of being offline only forfeits the effective computing power of that sector, not the pledged coins, and if the miner resubmits the proof within 28 days, the power is restored. Unlike Filecoin's 32 GB sectors, EpiK's sealed sectors are smaller, only 8 MB each, which largely solves Filecoin's sector-space waste problem and gives every miner the chance to complete sealing quickly; this is very friendly to miners with little computing power. Data volume and quality constraints also ensure that the gap in effective computing power between large and small miners stays limited.
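A quick sanity check on the pre-mining arithmetic above, in Python. The `mining_share` parameter is a hypothetical input of my own; the text's "about 50 days" figure corresponds to a node earning roughly 0.04% of the daily release.

```python
DAILY_RELEASE = 50_000  # ERC-20 EPK released per day during pre-mining (from the text)
PLEDGE = 1_000          # one-time per-miner collateral in EPK (from the text)

def days_to_pledge(mining_share: float) -> float:
    """Days a pre-mining node needs to accumulate the 1000 EPK pledge,
    given its (hypothetical) share of the daily release."""
    return PLEDGE / (DAILY_RELEASE * mining_share)

# A node earning 0.04% of each day's release reaches the pledge in ~50 days.
print(days_to_pledge(0.0004))
```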
Finally, unlike Filecoin's P2P data uploading model, EpiK changes data uploading and maintenance to E2P uploading: field experts upload the data and ensure the quality and value of what goes on the chain, while a rational economic model introduces a game relationship between data-storage roles and data-generation roles to keep the whole system stable and the on-chain data consistently high quality.

Deep Chain Finance: Eric, on the eve of Filecoin's mainnet launch, issues such as Filecoin's pre-collateral aroused a lot of controversy among miners. In your opinion, what kind of impact will Filecoin's launch have on itself and on the whole distributed storage ecosystem? Do you think the current chaotic FIL prices are reasonable, and what should a normal FIL price be?

Eric: Filecoin's mainnet has launched and many potential problems have been exposed, such as the aforementioned high pre-collateral, the storage waste and computing-power monopoly caused by unreasonable sector encapsulation, and the harsh penalty mechanism. These problems are serious and will greatly affect the development of the Filecoin ecosystem; here are two examples to illustrate. Take the computing-power monopoly of big miners: once big miners have monopolized computing power, a very delicate state emerges in which a miner stores file data nominally on behalf of ordinary users, yet there is no way to verify on-chain whether what he stored was uploaded by himself or by someone else.
Because a miner can fake another identity and upload data for himself, when choosing which data to store he has only one goal: inflating his computing power as fast as possible. Storing other people's data and storing his own data earn the same computing power, but someone else's data lives somewhere in the world where the bandwidth between the two parties may be poor, so the best option is to store his own local data. The result is that nobody stores anyone else's data on the chain at all; everyone stores their own, because it is the most economical for them, and the network ends up with essentially no storage utility, with no one providing storage for the mass of retail users. The harsh penalty mechanism also severely depletes miners' profits, because DDoS attacks are a very common technique for an attacker, and a big miner can profit handsomely in a short time by attacking his competitors; this is lucrative for all big miners. As things stand, the vast majority of miners are not well maintained and are poorly protected against even low-grade DDoS attacks, so the penalty regime is grim for them. The contradiction between an unreasonable system and real demand will inevitably push the system to evolve in a more reasonable direction, so there will be many forked projects with more reasonable mechanisms, attracting Filecoin miners and diverting storage power. Since every such project is on the decentralized storage track, their requirements for miners are similar or even mutually compatible, so miners will gravitate toward the forks with better economics and business scenarios, filtering out the projects with real value on the ground.
As for the chaotic FIL price: FIL is a project that has been through several years and carries too many expectations, so one can only say that the current situation has its reasons for existing. There is no way to predict a reasonable price for FIL, because in the long run it depends on whether the project can be commercialized and on the actual value of the on-chain data. In other words, we need to keep observing whether Filecoin becomes a game of computing power or a real carrier of value.

Deep Chain Finance: Leo, we just mentioned that Filecoin's pre-collateral issue caused dissatisfaction among miners. After the mainnet launch, the second round of space-race test coins were directly converted into real coins, and the official selling of FIL hit the market, so many miners said they were betrayed. EpiK's main motto is "save the miners eliminated by Filecoin". How does EpiK deal with Filecoin's various problems, and how will it achieve this "rescue"?

Leo: Filecoin's tacit approval of computing-power inflation was effectively a declaration that the team had chosen to abandon small miners, and converting test coins into real coins hurt the interests of the loyal big miners in one stroke. We don't know why such basic mistakes were made; we can only regret them. EpiK did not set out simply to fork Filecoin; rather, to build a shared knowledge-graph ecosystem, EpiK had to integrate decentralized storage, so it chose Filecoin's most battle-hardened technology: PoRep and PoSt decentralized verification. To ensure the quality of knowledge-graph data, EpiK only allows community-voted field experts to upload data, so EpiK naturally prevents miners from inflating computing power, and there is no reason for valueless data to take up such expensive decentralized storage resources.
With computing-power inflation impossible, the difference between big and small miners is minimal while the amount of knowledge-graph data is small. We can't claim to save the big miners, but we are definitely the best choice for the small miners currently being squeezed out by Filecoin.

Deep Chain Finance: Let me ask Eric: according to the EpiK protocol, EpiK adopts the E2P model, which allows only voted-in field experts to upload data. This is very different from Filecoin's P2P model, which lets individuals upload data as they wish. In your opinion, what are the advantages of the E2P model? And if only voted experts can upload data, does that mean the EpiK protocol is not available to everyone?

Eric: First, let me explain the advantages of the E2P model over the P2P model. There are five roles in the DAO ecosystem: miners, coin holders, field experts, bounty hunters, and gateways. These five roles share the EPK generated every day once the mainnet launches: miners receive 75% of the EPK, field experts 9%, and voting users share 1%. The remaining 15% fluctuates based on daily network traffic, and that 15% is partly a game between miners and field experts. Let me first describe the relationship between those two roles. The first group of field experts is selected by the Foundation and covers different areas of knowledge (knowledge in a broad sense, including not only serious subjects but also home, food, travel, etc.). This group can recommend the next group of field experts, and a recommended expert only needs to gather 100,000 EPK in votes to become a field expert. The field expert's job is to submit high-quality data to the miners, who are responsible for sealing this data into blocks.
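The daily emission split Eric describes can be sketched as follows. The percentages come from his answer; the `daily_emission` figure passed in is a hypothetical input for illustration.

```python
def split_emission(daily_emission: float) -> dict:
    """Split one day's EPK emission among the DAO roles,
    using the percentages quoted in the transcript."""
    shares = {
        "miners": 0.75,         # 75% to miners
        "field_experts": 0.09,  # 9% to field experts
        "voters": 0.01,         # 1% shared among voting users
        "floating": 0.15,       # 15% fluctuates with daily network traffic
    }
    return {role: daily_emission * s for role, s in shares.items()}

print(split_emission(50_000))
```

The fixed 85% rewards production and curation directly, while the floating 15% is the lever in the miner-versus-expert game described next.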
Network activity is measured by the amount of EPK pledged across the whole network for daily traffic (1 EPK = 10 MB/day). A higher pledge ratio indicates higher demand for data, which requires miners to improve bandwidth quality; if data demand falls, field experts must instead provide higher-quality data. It is like a library: more visitors means more seats are needed, i.e., miners are paid to upgrade bandwidth; fewer visitors means more money goes to buying better books to attract them, i.e., to bounty hunters and field experts for generating more high-quality knowledge-graph data. The game between miners and field experts is the most important game in the ecosystem, unlike the game between the officials and the big miners in the Filecoin ecosystem. This game between data producers and data storers, together with a more rational economic model, means the E2P model will generate stored on-chain data of much higher quality than the P2P model, with better access bandwidth, and therefore greater business value and better adoption scenarios.

I will then answer whether this means the EpiK protocol is not accessible to everyone. The E2P model only constrains the quality of the data generated and stored, not who can participate in the ecosystem. On the contrary, with the introduction of the DAO model, the variety of roles in the EpiK ecosystem, including roles open to ordinary people, such as bounty hunters who are competent at their tasks, is not limited, and gives everyone a logical way to participate in the system. For example, a miner with computing power can provide storage, a person with domain knowledge (history, technology, travel, comics, food, and so on) can apply to become an expert, and a person willing to label and correct data can become a bounty hunter.
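The pledge-to-traffic relation quoted above (1 EPK = 10 MB/day) can be sketched directly. The conversion constant comes from the answer; the "activity" ratio of pledged to circulating EPK is our own illustrative reading of "the amount of EPKs pledged by the entire network", not a documented formula.

```python
# Sketch of the pledge-to-traffic relation in the AMA: 1 pledged EPK
# buys 10 MB/day of traffic. The activity ratio below is an assumed
# illustration, not an official EpiK metric.

EPK_TO_MB_PER_DAY = 10  # 1 EPK = 10 MB/day, per the AMA

def daily_traffic_mb(pledged_epk: float) -> float:
    """Total daily traffic purchasable with the pledged EPK."""
    return pledged_epk * EPK_TO_MB_PER_DAY

def network_activity(pledged_epk: float, circulating_epk: float) -> float:
    """Fraction of circulating EPK locked for traffic; higher = more data demand."""
    return pledged_epk / circulating_epk

assert daily_traffic_mb(1_000) == 10_000  # 1,000 EPK -> 10,000 MB/day
```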
Efficient support tools from the project team will lower the barriers to entry for each role, allowing different people to do their part in the system and jointly contribute to the ongoing construction of a high-quality decentralized knowledge graph.

Deep Chain Finance: Leo, some time ago EpiK released a technical white paper and an economic white paper, explaining the EpiK concept from the perspectives of technology and economics respectively. What I would like to ask is: what are the shortcomings of current distributed storage projects, and how will the EpiK protocol improve on them?

Leo: "Distributed storage" is easily confused with databases like Alibaba's OceanDB, but in the blockchain field we should focus on decentralized storage. There is a big problem with the decentralized storage now on the market, a "let them eat cake" problem. What does that mean? The claim is that decentralized storage is cheaper than centralized storage by virtue of its technical principles, and if that were true, centralized storage would be too poor to even compare. But what incentive does the average user actually have to spend more money storing data on decentralized storage? Is it safer? Miners on a decentralized network can shut down at any time, which is by no means safer than keeping a copy each with Alibaba and Amazon. More private? There is no difference between storing encrypted data on decentralized storage and storing it encrypted on Amazon. Faster? The bandwidth of decentralized storage nodes simply doesn't compare to the fiber in a centralized server room. This is the root problem of the business model: no one uses it and no one buys it, so what becomes of the grand vision?
The goal of EpiK is to guide all community participants in jointly building and sharing field knowledge-graph data, which is the best way for machines to understand human knowledge: the more knowledge-graph data there is, the more a robot knows, and its intelligence grows exponentially. In other words, EpiK uses decentralized storage technology to capture the value of exponentially growing data at linearly growing hardware cost, and that is where the buy-in for EPK comes from. Organized data is worth far more than organized hard drives, and demand for EPK appears as soon as robots need intelligence.

Deep Chain Finance: Let me ask Leo: roughly how many forked projects does Filecoin have so far? Do you think there will be more or fewer waves of forks after the mainnet launch? Have the requirements for miners to participate changed?

Leo: We don't have exact statistics. Now that the mainnet has launched, we expect forked projects to increase; there are so many stranded miners in the market that they need to be organized efficiently. However, most forked projects we see simply tweak the parameters of Filecoin's economic model, which is pointless: that level of modification can't change the status quo of padded computing power, and its only effect is to make some big miners feel more comfortable mining, which does nothing to help decentralized storage reach real adoption. We need more reasonable adoption scenarios so that idle mining resources can be turned into effective productivity, instead of pitching yet another "100x coin" and riding one wave of FOMO after another.

Deep Chain Finance: How far along is the EpiK Protocol project, Eric? What big moves are coming in the near future?

Eric: The development of EpiK Protocol is divided into five major phases: Phase I, the test network "Obelisk"; Phase II, Mainnet 1.0 "Rosetta";
Phase III, Mainnet 2.0 "Hammurabi"; Phase IV, enriching the knowledge-graph toolkit; and Phase V, enriching the knowledge-graph application ecosystem. We are currently in Phase I, the test network "Obelisk": anyone can sign up for the pre-mining test to earn ERC-20 EPK tokens, exchangeable one-to-one after the mainnet launches. We recently launched ERC-20 EPK on Uniswap, where you can buy and sell it freely, or you can download our EpiK mobile wallet. We will also soon launch the EpiK Bounty platform, and we welcome all community members to complete tasks together and build the EpiK community. At the same time, we are pushing forward with centralized exchange listings.

Users' Questions

User 1: Some KOLs say Filecoin has already consumed the value of the next few years, so it will plunge. What do you think?

Eric: First of all, judgments about the market should correspond to cycles. Being bearish on FIL is not the same as being bearish on the project's economic model, nor on the distributed storage track as a whole. We are very confident in the distributed storage track; it will certainly go through cycles of growth and decline, and that is exactly what lets better projects get chosen. Since the existing pool of miners and the computing power already produced is fixed, and since EpiK miners and FIL miners are compatible, miners will at any time choose the more promising and more economically viable project. As for the claim that Filecoin has pre-consumed the value of the next few years and will therefore plunge: the plunge is not a prediction. In this industry you have to keep learning, iterating, and forming value judgments. Market sentiment is one factor, but there are more important ones; for example, the big washout in March this year will slow the development of the FIL community. In the end, prices are simply unpredictable.
User 2: In the end, if there are no applications and no one really uploads data, the market value will drop. So what are the real applications of EpiK?

Leo: The best and most direct application of EpiK's knowledge graph is the question-answering system: an intelligent legal advisor, an intelligent medical advisor, an intelligent chef, an intelligent tour guide, an intelligent game-strategy assistant, and so on.
AMA AT DETECTIVE ID (25/06/2020)

Before taking any questions, I would like to briefly introduce the STATERA PROJECT. Statera is a smart-contract deflationary token pegged to a cryptocurrency index fund. Because STA sits in an index fund alongside Link, BTC, ETH, and SNX, buying one token gives you exposure to the price action of four leading cryptocurrencies. You can also invest directly in the index fund (the Balancer pool) and receive trading fees and BAL tokens while holding an automatically rebalanced fund. Finally, STA's deflationary mechanics increase the chance of positive price action while decreasing beta (volatility). All of this lives in a fully decentralized smart contract: the founders can no longer modify the contract in any way, which has been confirmed by a third-party code audit by Hacken.

Q1: Please explain Statera in more detail. What is the background of this project, and when was it established?

The dev of this project had previously created another deflationary token, BURN. When Balancer Labs released the Balancer Protocol, he had the idea of combining the two, a deflationary token and a pool of tokens, making the first deflationary index fund. It started at the end of May, and on the third iteration, May 29th, the trustless version we see today was launched. As briefly explained earlier, STATERA (STA) is a deflationary index token built on the Ethereum blockchain. Index: contains a suite of world-class leading crypto assets: BTC, ETH, LINK, and SNX alongside STA. Deflationary: on every STA transaction, 1% of the transacted amount is sent to the zero (0x) address on Ethereum and burned forever, reducing STA's circulating supply. Index + Deflationary: STA is combined with BTC, ETH, LINK, and SNX in a portfolio backed by liquidity on the Balancer protocol (balancer.finance), which serves as an automated market maker for the token suite.
The index is equally weighted at 20% each: 20% BTC, ETH, SNX, LINK, and STA. Whenever one of those assets rises in value, Balancer automatically trades some of it for STA to keep the ratios balanced, and the same process applies whenever STA itself rises. Every such trade involves STA transfers, which triggers further burning and increases token scarcity. In addition, Statera was deployed with the contract finalized: the index suite cannot be altered and is completely out of the dev's control.

Q2: What has Statera achieved in 2020 so far, and what goals do you want to reach in 2020?

We assume this is asking for a roadmap! First, the project is barely a month old, and within that month our liquidity has grown from $50,000 to a peak above $400,000, currently above $300,000. Among our accomplishments so far is the market value created for BPT, the Balancer liquidity-pool token of STA's pool, currently over $1,000 per BPT. As for what we aim to achieve: the future is full of opportunities and potential. We are currently working on a major campaign to introduce our product to the wider world. We have contacted reputable forums and channels about marketing and advertising offers; some are under negotiation, for others we are awaiting responses. For now, all we can say is that the team is working hard to make this the investment opportunity every crypto enthusiast has been waiting for. Statera's goal is to put cryptocurrency into every portfolio. We believe we have a product that increases the returns of investing in cryptocurrencies and makes it easier to diversify in this space. We have done a lot in June: articles, how-to videos, the completed audit, tech upgrades such as single-token liquidity additions, and the launch of our many social communities.
We have been hard at work behind the scenes, but things like sponsorships, features, and media take time; content makers need days if not weeks to develop content, especially the best of the best. We are working tirelessly and will not disappoint. We have plans for 2020-2025 and will release them in the next month. They are big and bold; you are going to be impressed by the scale of our vision. When we say "cryptocurrency in every portfolio", we mean it. For 2020 specifically, we are focused on more media, videos, product offerings, and exchanges.

Q3: What is the purpose of the STA token? How can we get STA?

STA is an investment in the first deflationary index fund. The index's value rises from four sources: 1. the index assets (WBTC, WETH, SNX, LINK) appreciate in value; 2. when the index tokens are traded, the pool earns a 1% transaction fee; 3. STA burns on every transaction, so its deflationary nature raises its value as total supply drops; 4. Balancer rewards index holders with weekly BAL token airdrops. You can invest via the 'Trade' links on the stateraproject.com website; the easiest way is with ETH. The monetary policy of our token is set in stone and constantly deflationary. This negative supply pressure is a powerful mechanism in economics and price discovery: by lowering supply we can decrease your beta (volatility) and increase your alpha (gains). Our token is currently only top 40 in liquidity on Balancer, but our volume is top 10! Why? Because Statera works. Statera increases arbitrage, volume, fees, BAL rewards, and liquidity. Liquidity providers in our Balancer pool are already earning some of the highest BAL rewards on the platform; one user we spoke with made 18% in June, over 150% APY! Our product is working, 100% (or you could say 150%), and when people start to see that and realize the value, the sky's the limit.

Q4: Can we, as users, mine STA?
The supply of STA no longer increases; it only decreases due to the burn feature. So there is no way to mine more STA; the only way to acquire tokens is via an exchange.

Q5: The ecosystem of a public chain has a lot to do with the engagement and participation of third-party developers. How does Statera support developers?

That is not really our focus. Our project is about investment opportunities in cryptocurrencies: tokens that would otherwise sit unused in a wallet can work for you by being added to an index fund and appreciating in value over time. First off, what we have created is a new asset class; I'll repeat that, a new asset class. This asset has never existed before: a "deflationary index fund". What does that mean for finance? What will developers do with it? It's hard to give a definitive answer. We hope future economics papers study our token and what it means to be a deflationary index fund. With the addition of synthetic assets and oracles, you can bring any asset into the DeFi space: gold, the Nikkei 225, USD, and so on. STA can be combined with any of these assets and bring the benefits of its ecosystem and deflationary mechanism to them.
STA, the token itself, also gives you exposure to the price action of any asset it is paired with. Put simply, STA's Balancer pool(s) reward you for holding them, and STA's price will reflect its inclusion in Balancer pools (and possibly future financial instruments), so STA is a bet on DeFi as a whole. And when we say as a whole, we mean it: what happens if you include STA in a crypto loan, package it with a synthetic S&P 500 token, or use it for fee payment on a DeFi platform? Being fully decentralized, it is up to our community to make this happen; social engagement and community are key. We are constantly bringing community members onto our team and rewarding those who benefit the ecosystem. In addition, Statera is now a fully community-run project. Paul, the current team leader, was an ordinary community member weeks ago; because of his interest in and support for the project, he began dedicating his time to it. Quite a few community members are in the same position. While Statera was developed by an individual, it is being built by the entire Statera community.

Community Questions (Twitter):

Q1 from @KazimKara35: The project tells us that the acquisition and sale of data between participants is protected by a code of conduct deployed on the blockchain, but how do you handle regulations while operating on a global scale?

Statera is a decentralized token, similar to other utility crypto tokens, and the same regulations apply to it as to the others. This is actually a benefit of our decentralized nature. This isn't legal advice, but in the past regulators have ruled that the more decentralized a project is, especially from launch, the less likely it is to be deemed a security (see: Ethereum). That means it can be traded more freely and be available on more platforms. We are as decentralized as you can be. The data itself is all secured through the blockchain, which has been shown to be a highly secure medium.
We do not store any of your data, and as long as you follow best practices in blockchain security there are no added security risks in using Statera. We don't, and literally can't, hold any more personal information than is exposed in any blockchain transaction, and that "personal information" is most likely just your Ethereum wallet address; no real-world data is included in transactions.

Q2 from @Michael_NGT353: What mechanism does your project use? PoS, PoW, or something else? Can you explain why you use it and what makes it different?

Our token is an ERC-20 token running on the Ethereum blockchain, so Ethereum's PoW mechanism currently secures the Statera token. With ETH 2.0 we will hopefully be PoS this year (hopefully). We use Ethereum because it has over 100 million addresses and around a million daily transactions. We are currently at about 1,900 token holders, so we are only touching the edge of what is possible in this market. We chose the biggest and best network available to launch our product, and we think the upside of that choice is huge. Being the biggest network, it is also one of the most secure: no high-risk vulnerabilities have been found in Ethereum or in our code (our code was audited by a third party, Hacken, and you can read the audit on our Medium page), so we also have security on our side.

Q3 from @Ryaaan_Nguyen: Can you list some of Statera's outstanding features for everyone here? What products is Statera focusing on developing?

As GC mentioned earlier, what we have created is a new asset class; I'll repeat that, a new asset class. This asset has never existed before: a "deflationary index fund". What does that mean for finance? What will developers do with it? It's hard to give a definitive answer.
We hope future economics papers study our token and what it means to be a deflationary index fund. With the addition of synthetic assets and oracles, you can bring any asset into the DeFi space: gold, the Nikkei 225, USD, and so on. STA can be combined with any of these assets and bring the benefits of its ecosystem and deflationary mechanism to them. STA, the token itself, also gives you exposure to the price action of any asset it is paired with. Put simply, STA's Balancer pool(s) reward you for holding them, and STA's price will reflect its inclusion in Balancer pools (and possibly future financial instruments), so STA is a bet on DeFi as a whole. And when we say as a whole, we mean it: what happens if you include STA in a crypto loan, package it with a synthetic S&P 500 token, or use it for fee payment on a DeFi platform? We touched on this in the question about what makes us special compared to other projects. We have created a product that synergizes with Balancer pools, a symbiotic relationship that improves outcomes for users (and our product can synergize with future DeFi products as well). Because STA sits in an index fund with Link, BTC, ETH, and SNX, buying one token gives you exposure to the price action of four leading cryptocurrencies. You can also invest directly in the index fund (the Balancer pool) and receive fees and BAL tokens while holding an automatically balanced portfolio (like an index fund with dividends). Lastly, STA's deflationary mechanics increase the chance of positive price action while decreasing beta. We want to pair Statera with assets across the whole cryptocurrency space, with an emphasis on DeFi. We also want everyday people to be able to invest in crypto quickly while feeling reassured their investment is set up to succeed. We are focused on building a brand people turn to first and foremost when investing in crypto: cryptocurrency in every portfolio.
All of this lives in a fully decentralized smart contract: the founders can no longer modify the contract in any way, which has been confirmed by the third-party code audit. This is a feature in and of itself. Some argue that Bitcoin's true value lies in its network effect, first-mover advantage, and immutability; Statera is modeled on all three and has those features in spades. The community now owns our token. The power in that, giving finance and power back to the people, is why we are here.

Q4 from @futcek: What do you think about the possibility of creating new DeFi use cases for existing real-world assets using crypto technology? What role do you see for Statera in this?

I think my answer above covers this perfectly: Statera is itself a "new use case"; a "deflationary index fund" has never existed. I'll copy the other relevant part: "With the addition of synthetic assets and oracles, you can bring any asset into the DeFi space: gold, the Nikkei 225, USD, and so on. STA can be combined with any of these assets and bring the benefits of its ecosystem and deflationary mechanism to them. STA, the token itself, also gives you exposure to the price action of any asset it is paired with. Put simply, STA's Balancer pool(s) reward you for holding them, and STA's price will reflect its inclusion in Balancer pools (and possibly future financial instruments), so STA is a bet on DeFi as a whole. And when we say as a whole, we mean it: what happens if you include STA in a crypto loan, package it with a synthetic S&P 500 token, or use it for fee payment on a DeFi platform? Being fully decentralized, it is up to our community to make this happen; social engagement and community are key.
We are constantly bringing community members onto our team and rewarding those who benefit the ecosystem."

Statera is a way to make your investments more successful, and owning Statera lets you benefit from other people using it to make their own investments more successful: a self-feeding cycle.

Q5 from @Carmenzamorag: Statera's deflationary system destroys 1% of the amount in every transaction. Could this lead to a lack of supply and liquidity in the long term? How would that be fixed?

The supply curve is asymptotic, meaning it will never reach zero. The idea is that the deflationary process slowly decreases the supply of STA, which, combined with fixed or increasing demand, results in STA appreciating in value. Evidently, as the STA token increases in value, the amounts of STA being traded will slowly decrease: a typical investor might buy 10,000 STA at the current rate, but in the future, in proportion to an increase in STA's valuation, that number will tend to fall, and the future investor might only buy 1,000 STA. That, of course, results in less STA being burned per trade. Additionally, STA is divisible to 18 decimal places, which is why, even if the supply were to reach 1 STA, there would still be sufficient supply. This is really a question for a mathematician, and luckily we're loaded with them (as seen above)! I'll illustrate with an example: 1% of 100 million is 1 million, while 1% of 10 million is 100,000. As supply goes down, the burn shrinks in volume. Lower supply also means higher prices (supply and demand economics), so the 1 million tokens burned early on might be worth $20,000, and by the time overall supply is down to 10 million, the 100,000 tokens burned might also be worth $20,000 or even more. It also means you transact "less": if you want to buy 1 Ether now with Statera you need about 8,900 STA, which burns 89 tokens; if Statera were worth $100, you would only need 2.32 STA (burning about 0.023 tokens).
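The worked burn example in that answer can be checked numerically. A minimal sketch: the 1% burn rate is from the AMA, while the implied ETH price of about $232 is our assumption, back-derived from "8,900 STA per ETH", not an official figure.

```python
# Numeric check of the burn example above. BURN_RATE is from the AMA;
# the ETH price is an assumption implied by the quoted figures.

BURN_RATE = 0.01  # 1% of every STA transfer is burned

def burn_on_transfer(amount_sta: float) -> float:
    """STA destroyed when `amount_sta` is transferred."""
    return amount_sta * BURN_RATE

eth_price_usd = 232.0  # assumed, implied by "8,900 STA buys 1 ETH"

# Today: ~8,900 STA buys 1 ETH, burning ~89 STA.
assert round(burn_on_transfer(8_900), 2) == 89.0

# If STA were worth $100: only 2.32 STA buys 1 ETH, burning ~0.023 STA.
sta_needed = eth_price_usd / 100.0
assert round(burn_on_transfer(sta_needed), 4) == 0.0232
```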
Along with this proportional decrease in the burn, the token has 18 decimals, so even when we get down to 1 token left (which would mathematically take decades if not centuries, depending entirely on usage), you are still left with 10 to the 18th power, one quintillion, base units. So it's going to take us a while to have supply issues :)

Nuked Phase (3rd Part)

Q) What is your vision and mission?

Our working mission and vision. Mission: provide every investor with simple and effective ways to invest in cryptocurrency; decrease volatility and increase positive price pressure in cryptocurrency investments; lower the barrier to entry for more advanced investment tools; be a community-focused and community-driven cryptocurrency, fully decentralized in every sense of the word. Vision: we aspire to put "cryptocurrency in every portfolio". We envision a world where finance is given back to the people, and wealth-building strategies once reserved for affluent individuals are available to all. We also strive to create an investment environment based on sound monetary policy and all the power that comes with a sound asset.

Q) What are the long-term benefits of STA for investors? Is Africa an important area for its expansion?

We have ties to Africa and see Statera as a way for anyone and everyone to invest in cryptocurrency. Statera's small market cap makes its price low and its upside massive. If you want exposure to the price action of four cryptocurrencies (BTC, ETH, Link, SNX), Statera offers that exposure with a huge upside compared to holding the four assets directly. There are risks in investing in any small cap, but with those risks come outsized rewards (not investment advice; all answers are solely my opinions 😊).

Q) In the long run, why should we trust and follow STATERA? How do you raise awareness and eliminate the doubts of investors, partners, and customers?
You're really asking, "How do I trust myself and other crypto investors?" The project is FULLY decentralized; it is now in the hands of the community. We would venture a guess that the community wants its investment to succeed and be worth more in the future, so you are betting on people wanting to make money on their own investment. That is a pretty sure bet. Keeping the community active and engaged is key, and we have short- and long-term plans to make that happen.

Q) No one can doubt the strength of #Statera, but can you tell us some of the challenges and difficulties you are currently facing, and how you plan to overcome them?

We're swinging outside our weight class: we don't see Litecoin, SNX, or any other crypto product as our competition. Our competition is NASDAQ, Fidelity, and the like. We want to give everyone access to world-class financial instruments that only the wealthy have in the traditional world: providing liquidity, risk parity, being paid to provide liquidity, and other unique value propositions. But we are coming up in a hectic space; every day there is FUD and defamation on the web. That is the sandbox we chose to play in, and we aren't grabbing our ball and going home. We can tell you that we will not disappoint, and fighting the FUD that comes with being a small, upstart project only fuels our fire. Building legitimacy is our largest challenge, and looking at our audit, our financial report, and some things you will see in the coming weeks, we hope you can tell we are facing that challenge head on.

Q) What is the actual uniqueness of #Statera? Can you explain its advantages over other projects?

When we launched there were no other products like ours. There are now copies, and we wish them the best, but we have the best product, hands down.
That will become apparent over the next couple of weeks, if it hasn't already. A lot of the AMA answers dug deeper into our unique value proposition, especially the benefits we provide to Balancer pools, which shows the benefits we would bring to any index fund. We are a tool to improve cryptocurrency investing.

Q) Sharding, layering, and cross-chain are three future directions for high-performance blockchains. Where does Statera stand, and what are the main reasons for taking this direction?

We operate on the Ethereum chain; as it upgrades, our services and usability will upgrade with it. We are working on UI and more user-friendly systems to onboard people into our ecosystem.

Q) How does STATERA plan to make room for itself and become known in a crypto world full of technology and very good new projects?

We think we have a truly innovative product which, once understood, appeals to most investors, whether you want a higher-volatility, medium-risk token like STA, or you are more conservative and simply plan to hold the Statera pool's BPT (which is far less volatile but still offers great returns). We plan to make Statera known to the crypto world through a marketing campaign that will slowly unfold in the coming days and weeks. If you are interested, you can find an analysis of the different investment options in the Statera ecosystem in our first financial report: https://medium.com/@stateraproject/statera-financial-reports-b47defb58a18

Q) Cryptocurrencies are very volatile and follow Bitcoin. Does this apply to Statera, or is some other logic at work? Is the Statera token different from a typical token? Are you working on listings on other exchanges? Uniswap fees are currently somewhat uncomfortable.

We are also on Bamboo Relay, Saturn Network, and Mesa. Statera will be volatile like all cryptocurrency; this is a small and nascent space.
But with the deflationary mechanic and the Balancer pool, as market cap grows over time it will become less volatile and react more positively in price.

Q) Security is one of the most essential ingredients of a project's reputation. How can the #Statera team assure the community that users' assets and investments will stay safe from bad actors?

We have been audited by the same third party that audited VeChain's code. Our code has been shown to be bulletproof: unless Ethereum itself develops a fatal security flaw, nothing can happen to our contract (there is no backdoor and no way for anyone to edit or adjust the smart contract).

Q) Many investors judge a project by its coin price. Why is Statera suitable for long-term investment, and what makes it different from similar projects?

Sometimes the simplest solutions are the most effective. You can ask, "What if this fails?", but you can also ask, "What if this succeeds?" Cryptocurrency is filled with asymmetric risks. If you look into the value proposition, you will find a huge asymmetric risk/reward in Statera, and we will make that even clearer in our soon-to-be-released litepaper. You are on the ground floor of a simple but highly effective way to onboard people into DeFi, cryptocurrencies, and investing. Our product reduces volatility and increases gains (decreases beta and increases alpha, in investor terms), which is highly attractive in any investment. The downside exists, but the upside outweighs it exponentially (asymmetric risk).

Q) What are your plans for global expansion? Is Statera focusing only on marketing right now, or on building and development, acquiring customers and users, or partnerships? Can you explain?

We have reached out to influencers in other countries and things are in the works.
We have also translated documents and are working on having them in at least 4 languages by the end of July. We were founded globally, our team is global, and we are focused on reaching all 7 billion people.

Q) Every day new projects join the blockchain space. They are upgraded, well established, and come with innovative technology. How is Statera going to compete with them? Do you think Statera could one day become useless and be lost into the abyss of time for not bringing any new technology?

We are the first of our kind; no one had a deflationary index fund before us. Index funds will be the future of crypto (look at the popularity of ETFs and indexes in the traditional markets). We are a tool to make your index function better and pay you more. As long as people care about crypto index funds, they will care about the value STA brings to that. We have an involved, long-term plan to reach dominance over a 5-year span. This is not a flash in the pan; big things are coming.

Q1) You say that the weight and proportions of your tokens are constant. So how have you managed to prevent market price speculation from generating hypervolatility in your token price? Do you consider yourselves a kind of stablecoin?
Q2) How many jurisdictions allow the use of Statera products and services? Are they available for Latin America?

@joloroeowo The Balancer ensures an equal ratio of 20% amongst the five tokens included in our fund. This, however, does not imply that the tokens are stable. Rather, the Balancer protocol helps mitigate price fluctuations.

Q) How can I, as a Statera participant, participate in liquidity mining and receive BAL as a reward? What are the use cases of the $STA token, and how are users motivated to buy and hold long term?

The easiest way is to go to stateratoken.com, click Trade, then BPT. You can also buy all five tokens and click on Portfolio, then Add Liquidity.
Balancer is working on a simpler interface to add liquidity with one token; we are waiting on them. I think we explained the use cases above.

Q) What plans do you have for global expansion? Is Statera currently focused solely on marketing, or on building and developing, acquiring customers and users, or partnership relationships? Can you explain it?

We are currently working on promoting the project and further developing our product, making it lucrative for more new investors to join our pool and invest in the STA token.

Q1) Statera has 2 types of tokens, so can you tell me the differences between STA and STAC? What are their use cases? Is it possible to swap between them?
Q2) Currently the only exchange possible is Uniswap, so do you have plans to list the STA token on more exchanges?

STAC is obsolete; we only have STA and BPT (go to our website, stateratoken.com, and click on Trade). BPT gives you more diversification and less risk; STA gives you more volatility and more chance for big gains. As for Q2, we are on multiple exchanges (4): Bamboo Relay, Saturn, and Mesa. We do have plans for future exchanges, but the big ones have processes and hoops to jump through that can't be done so quickly.

Q) What business scenarios can STATERA support now? In which industries can we see mass adoption of STATERA technology in the near future?

Statera increases the effectiveness of your cryptocurrency investments. Specifically, it makes cryptocurrency index funds function better, netting you higher returns, which we have already seen in just one month of implementation. Right now, today, you can buy our BPT token and increase the functionality of holding a crypto index fund. In the future we want every single web user to see and use our product.

Q) Do you plan to migrate to other platforms like Tron, Binance Chain, EOS, etc. if it is feasible?

Migrating our current contract is not possible.
Starting new offerings on those other chains could be possible. They aren't on our radar currently, but if the community requests them, we are driven by our community.

Q) The ETH blockchain hosts many tokens. I have used the ETH blockchain for a long time and I see it has big fees and transactions take a long time. Why did you choose to base STA on the ETH blockchain and not others like BEP2 or TRC20?

Simply: 100 million addresses, 1 million transactions a day. The more users we have, the more we will benefit our community. We hope ETH 2.0 scaling will fix the problems you mention.

Q) No one achieves anything of value on their own. Can you share Statera's present and future partnerships that will drive you to success in this highly congested crypto space?

We have a unique product that no one else has (there are people who have copied us). We can't announce our current and future partnerships yet, but they will be released soon. Our future hopes for partnerships are big and will be key to our future; know that we are focused on making big partnerships, some you may not even be thinking about.

Q) Given that your algorithm causes 1% of each transaction to be destroyed, how do you plan to finance yourselves as a project in the long term?

The project is now in the hands of the community, and we are a team of passionate people volunteering to help promote and develop the Statera ecosystem. But then, how do we afford running a promo campaign? We have lots of great community members donating funds that go to promoting the project. In other words, the community helps finance the project. And so far, we have created a fantastic community consisting of passionate and well-educated people!

Q) Many cryptocurrency startups were established by talented teams but had problems raising capital via token sales due to factors such as bear markets or bankruptcy, which led potentially successful startups to fail.
So how will Statera break these barriers and attract more funds from outside the crypto space?

We are community focused and community run. When you look at centralized cryptocurrencies you can see their negatives (Tron, ADA, etc.). We believe being fully decentralized is the true power position. You, the owner of Statera, can affect our future and must affect our future. This direct ownership means people need to mobilize and organize to push us forward, and it is in their best self-interest to do so. It's a bet on our community, and we're excited about that bet.

Q) Why be a hybrid of a liquidity pool and an index fund? What are the main benefits of this?

By being a liquidity pool, the exchange side of the pool (Balancer also functions as an exchange) gives you added liquidity for more effortless, effective, and cheaper rebalancing. You also benefit from getting paid the fee when people use the exchange AND getting paid BAL tokens that are worth $15-20 USD. These are not benefits you get with an index fund, while the liquidity pool rebalances just like an index fund would.

Q) What specifically about the technology and strategy of #STA makes you believe it will be successful, and what does #STA plan to do to attract more users in the upcoming time?

I think the idea behind Statera is truly ingenious. We have made an index fund which investors are highly(!) incentivised to invest in, namely because the ROI, so far, has been huge.
An increase in the pool liquidity (index fund) indirectly translates into an increase in the price of STA, which is why we think the STA token, combined with its deflationary nature, will increase in the long run. The mechanism behind this is somewhat complex, but to get a better understanding of it, I suggest you visit our Medium page and read more about the project: https://medium.com/@stateraproject
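The deflationary mechanic discussed throughout this AMA (1% of each STA transaction is destroyed) can be sketched in a few lines of Python. This is an illustrative model only, not Statera's actual Solidity contract; the class, account names, and supply figure are hypothetical.

```python
class DeflationaryToken:
    """Toy model of a token that burns 1% of every transfer
    (illustrative only; not Statera's real contract)."""

    def __init__(self, initial_supply):
        self.total_supply = initial_supply
        self.balances = {"deployer": initial_supply}

    def transfer(self, sender, recipient, amount):
        if self.balances.get(sender, 0) < amount:
            raise ValueError("insufficient balance")
        burn = amount // 100                  # 1% of every transfer is destroyed
        self.balances[sender] -= amount
        self.balances[recipient] = self.balances.get(recipient, 0) + (amount - burn)
        self.total_supply -= burn             # supply shrinks with each transfer
        return amount - burn

token = DeflationaryToken(101_000_000)
token.transfer("deployer", "alice", 1_000_000)
print(token.total_supply)  # 100990000: 10,000 units were burned
```

Because supply only ever shrinks, heavier trading volume directly accelerates the deflation the team describes.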
Author: Gamals Ahmed, CoinEx Business Ambassador

ABSTRACT

The DFINITY blockchain computer provides a secure, performant and flexible consensus mechanism. At its core, DFINITY contains a decentralized randomness beacon, which acts as a verifiable random function (VRF) that produces a stream of outputs over time. The novel technique behind the beacon relies on the existence of a unique-deterministic, non-interactive, DKG-friendly threshold signature scheme. The only known examples of such a scheme are pairing-based and derived from BLS. The DFINITY blockchain is layered on top of the DFINITY beacon and uses the beacon as its source of randomness for leader selection and leader ranking. A "weight" is attributed to a chain based on the ranks of the leaders who propose the blocks in the chain, and that weight is used to select between competing chains. The blockchain is further hardened by a notarization process which dramatically improves the time to finality and eliminates the nothing-at-stake and selfish-mining attacks. The DFINITY consensus algorithm is designed to scale through continuous quorum selections driven by the random beacon. In practice, DFINITY achieves block times of a few seconds and transaction finality after only two confirmations. The system gracefully handles temporary losses of network synchrony, including network splits, and is provably secure under synchrony.
DFINITY is building a new kind of public decentralized cloud computing resource. The company's platform uses blockchain technology to build a public decentralized cloud computing resource with unlimited capacity, performance and algorithmic governance shared by the world, with the capability to power autonomous self-updating software systems, enabling organizations to design and deploy custom-tailored cloud computing projects, thereby reducing enterprise IT system costs by 90%. DFINITY aims to explore new territory and prove that the blockchain opportunity is far broader and deeper than anyone has hitherto realized, unlocking the opportunity with powerful new crypto. Although a standalone project, DFINITY is not maximalist minded and is a great supporter of Ethereum.

The DFINITY blockchain computer provides a secure, performant and flexible consensus mechanism. At its core, DFINITY contains a decentralized randomness beacon, which acts as a verifiable random function (VRF) that produces a stream of outputs over time. The novel technique behind the beacon relies on the existence of a unique-deterministic, non-interactive, DKG-friendly threshold signature scheme. The only known examples of such a scheme are pairing-based and derived from BLS.

DFINITY's consensus mechanism has four layers: notary (provides fast finality guarantees to clients and external observers), blockchain (builds a blockchain from validated transactions via the Probabilistic Slot Protocol driven by the random beacon), random beacon (provides the source of randomness for all higher layers, such as smart contract applications), and identity (provides a registry of all clients).

Figure 1: DFINITY's consensus mechanism layers

1. Identity layer: Active participants in the DFINITY network are called clients. Clients are registered with permanent, pseudonymous identities.
Moreover, DFINITY supports open membership by providing a protocol for registering new clients by depositing a stake with an insurance period. This is the responsibility of the first layer.

2. Random Beacon layer: Provides the source of randomness (VRF) for all higher layers, including applications (smart contracts). The random beacon in the second layer is an unbiasable, verifiable random function (VRF) that is produced jointly by registered clients. Each random output of the VRF is unpredictable by anyone until just before it becomes available to everyone. This is a key technology of the DFINITY system, which relies on a threshold signature scheme with the properties of uniqueness and non-interactivity.

3. Blockchain layer: The third layer deploys the "probabilistic slot protocol" (PSP). This protocol ranks the clients for each height of the chain, in an order that is derived deterministically from the unbiased output of the random beacon for that height. A weight is then assigned to block proposals based on the proposer's rank, such that blocks from clients at the top of the list receive a higher weight. Forks are resolved by favoring the "heaviest" chain in terms of accumulated block weight, quite similar to how traditional proof-of-work consensus favors the chain with the highest accumulated amount of work. The first advantage of the PSP protocol is that the ranking is available instantaneously, which allows for a predictable, constant block time. The second advantage is that there is always a single highest-ranked client, which allows for homogeneous network bandwidth utilization. By contrast, a race between clients would favor bursty bandwidth usage.

4. Notarization layer: Provides fast finality guarantees to clients and external observers. DFINITY deploys the novel technique of block notarization in its fourth layer to speed up finality.
A notarization is a threshold signature under a block, created jointly by registered clients. Only notarized blocks can be included in a chain. RSA-based alternatives exist but suffer from the impracticality of setting up the threshold keys without a trusted dealer. DFINITY achieves its high speed and short block times exactly because notarization is not full consensus. DFINITY does not suffer from selfish-mining or nothing-at-stake attacks because the notarization step makes it impossible for an adversary to build and maintain a chain of linked, notarized blocks in secret. DFINITY's consensus is designed to operate on a network of millions of clients. To enable scalability to this extent, the random beacon and notarization protocols are designed so that they can be safely and efficiently delegated to a committee.
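The PSP ranking and heaviest-chain rule described above can be sketched as follows. This is a simplified Python illustration, not DFINITY's implementation: the seeded shuffle stands in for the real rank derivation, and the rank-to-weight function is a hypothetical choice that merely preserves the property that higher-ranked (lower-numbered) proposers contribute more weight.

```python
import hashlib
import random

def rank_clients(beacon_output: bytes, clients: list) -> list:
    """Derive a deterministic per-height ranking of clients from the
    random beacon output (a seeded shuffle stands in for the real PSP)."""
    seed = int.from_bytes(hashlib.sha256(beacon_output).digest(), "big")
    rng = random.Random(seed)
    order = clients[:]
    rng.shuffle(order)
    return order

def block_weight(rank: int) -> float:
    """Hypothetical weight function: the top-ranked proposer's block
    counts most, and the weight decays with rank."""
    return 2.0 ** -rank

def chain_weight(proposer_ranks: list) -> float:
    """A chain's weight is the sum of its blocks' weights; forks are
    resolved in favor of the heaviest chain."""
    return sum(block_weight(r) for r in proposer_ranks)

clients = ["c0", "c1", "c2", "c3"]
ranking = rank_clients(b"beacon-height-42", clients)
# Every node derives the same ranking from the same beacon output:
assert ranking == rank_clients(b"beacon-height-42", clients)
# A chain built mostly by top-ranked proposers outweighs a fork of
# lower-ranked blocks, even one of the same length:
assert chain_weight([0, 0, 1]) > chain_weight([2, 1, 3])
```

Because the ranking is fixed the instant the beacon value is known, block production never degenerates into a race, which is exactly the bandwidth argument made above.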
1.1 OVERVIEW ABOUT DFINITY
DFINITY is a blockchain-based cloud-computing project that aims to develop an open, public network, referred to as the "internet computer," to host the next generation of software and data. It is a decentralized and non-proprietary network to run the next generation of mega-applications, and the project has dubbed this public network "Cloud 3.0". DFINITY is a third-generation virtual blockchain network that sets out to function as an "intelligent decentralised cloud,"¹ strongly focused on delivering a viable corporate cloud solution. The DFINITY project is overseen, supported and promoted by DFINITY Stiftung, a not-for-profit foundation based in Zug, Switzerland. DFINITY is a decentralized network design whose protocols generate a reliable "virtual blockchain computer" running on top of a peer-to-peer network, upon which software can be installed and can operate in the tamperproof mode of smart contracts. DFINITY introduces algorithmic governance in the form of a "Blockchain Nervous System" that can protect users from attacks and help restart broken systems, dynamically optimize network security and efficiency, upgrade the protocol and mitigate misuse of the platform, for example by those wishing to run illegal or immoral systems. DFINITY is an Ethereum-compatible smart contract platform that is implementing some revolutionary ideas to address blockchain performance, scaling, and governance. Although DFINITY could pose an existential threat to Ethereum, the project is pursuing a coevolutionary strategy by contributing funding and effort to Ethereum projects and freely offering its technology to Ethereum for adoption. DFINITY has labeled itself Ethereum's "crazy sister" to express its close genetic resemblance to Ethereum, differentiated by its obsession with performance and its neuron-inspired governance model. Dfinity raised $61 million from Andreessen Horowitz and Polychain Capital in a February 2018 funding round.
At the time, Dfinity said it wanted to create an "internet computer" to cut the costs of running cloud-based business applications. A further $102 million funding round in August 2018 brought the project's total funding to $195 million. In May 2018, Dfinity announced plans to distribute around $35 million worth of Dfinity tokens in an airdrop. It was part of the company's plan to create a "Cloud 3.0." Because of regulatory concerns, none of the tokens went to US residents. DFINITY will broaden and strengthen the EVM ecosystem by giving applications a choice of platforms with different characteristics. However, if DFINITY succeeds in delivering a fully EVM-compatible smart contract platform with higher transaction throughput, faster confirmation times, and governance mechanisms that can resolve public disputes without causing community splits, then it will represent a clearly superior choice for deploying new applications and, as its network effects grow, an attractive place to bring existing ones. Of course, the challenge for DFINITY will be to deliver on these promises while meeting the security demands of a public chain with significant value at risk.
1.1.1 DFINITY FUTURE
DFINITY aims to explore new blockchain territory related to the original goals of the Ethereum project and is sometimes considered “Ethereum’s crazy sister.”
DFINITY is developing blockchain-based infrastructure to support a new style of the internet (akin to Ethereum’s “World Computer”), one in which the internet itself will support software applications and data rather than various cloud hosting providers.
The project suggests this reinvented software platform can simplify the development of new software systems, reduce the human capital needed to maintain and secure data, and preserve user data privacy.
Dfinity aims to reduce the costs of cloud services by creating a decentralized "internet computer," which may launch in 2020.
Dfinity claims transactions on its network are finalized in 3–5 seconds, compared to 1 hour for Bitcoin and 10 minutes for Ethereum.
1.1.2 DFINITY’S VISION
DFINITY's vision is that its new internet infrastructure can support a wide variety of end-user and enterprise applications. Social media, messaging, search, storage, and peer-to-peer Internet interactions are all examples of functionalities that DFINITY plans to host atop its public Web 3.0 cloud-like computing resource. In order to provide the transaction and data capacity necessary to support this ambitious vision, DFINITY features a unique consensus model (dubbed Threshold Relay) and algorithmic governance via its Blockchain Nervous System (BNS), sometimes also referred to as the Network Nervous System or NNS.
February 15, 2017: An Ethereum-based community seed round raises 4M Swiss francs (CHF). The round was raised by the DFINITY Stiftung, a not-for-profit foundation based in Zug, Switzerland; the foundation held $10M of assets as of April 2017.

February 8, 2018: Dfinity announces a $61M fundraising round led by Polychain Capital and Andreessen Horowitz. This round, together with a DFINITY Ecosystem Venture Fund that will be used to support projects developing on the DFINITY platform and the Ethereum-based raise in 2017, brings the total funding for the project over $100 million. This is the first cryptocurrency token that Andreessen Horowitz has invested in, led by Chris Dixon.

August 2018: Dfinity raises a $102,000,000 venture round from Multicoin Capital, Village Global, Aspect Ventures, Andreessen Horowitz, Polychain Capital, Scalar Capital, Amino Capital and SV Angel.

January 23, 2020: Dfinity launches an open-source platform aimed at the social networking giants.
Dfinity is building what it calls the internet computer, a decentralized technology spread across a network of independent data centers that allows software to run anywhere on the internet rather than in server farms that are increasingly controlled by large firms, such as Amazon Web Services or Google Cloud. This week Dfinity is releasing its software to third-party developers, who it hopes will start making the internet computer's killer apps. It is planning a public release later this year.

At its core, the DFINITY consensus mechanism is a variation of the Proof of Stake (PoS) model, but offers an alternative to traditional Proof of Work (PoW) and delegated PoS (dPoS) networks. Threshold Relay intends to strike a balance between the inefficiencies of decentralized PoW blockchains (generally characterized by slow block times) and the less robust game theory involved in vote delegation (as seen in dPoS blockchains). In DFINITY, a committee of "miners" is randomly selected to add a new block to the chain. An individual miner's probability of being elected to the committee proposing and computing the next block (or blocks) is proportional to the number of dfinities the miner has staked on the network. Further, a "weight" is attributed to a DFINITY chain based on the ranks of the miners who propose blocks in the chain, and that weight is used to choose between competing chains (i.e. resolve chain forks).

A decentralized random beacon manages the random selection process of temporary block producers. This beacon is a verifiable random function (VRF), a pseudo-random function that provides publicly verifiable proofs of its outputs' correctness. A core component of the random beacon is the use of Boneh-Lynn-Shacham (BLS) signatures. By leveraging the BLS signature scheme, the DFINITY protocol ensures no actor in the network can determine the outcome of the next random assignment.
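The stake-proportional committee selection described above can be sketched as follows. The `select_committee` helper is hypothetical: the real protocol derives its randomness from BLS threshold signatures, whereas this sketch hashes a beacon value to seed an ordinary PRNG and, for simplicity, samples with replacement.

```python
import hashlib
import random

def select_committee(stakes: dict, beacon_output: bytes, size: int) -> list:
    """Sample a block-making committee with probability proportional to
    stake, seeded by the random beacon so that every node independently
    derives the same committee. (Illustrative stand-in only.)"""
    seed = int.from_bytes(hashlib.sha256(beacon_output).digest(), "big")
    rng = random.Random(seed)
    miners = list(stakes)
    weights = [stakes[m] for m in miners]
    # random.choices draws with replacement, weighted by stake
    return rng.choices(miners, weights=weights, k=size)

stakes = {"alice": 500, "bob": 300, "carol": 200}
committee = select_committee(stakes, b"round-7-beacon", size=3)
# Deterministic: the same beacon output yields the same committee everywhere
assert committee == select_committee(stakes, b"round-7-beacon", size=3)
```

Since the seed comes entirely from the beacon, no miner can bias which committee is chosen, which is exactly the property the BLS-based beacon is meant to guarantee.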
Dfinity is introducing a new standard, which it calls the internet computer protocol (ICP). These new rules let developers move software around the internet as well as data. All software needs computers to run on, but with ICP the computers could be anywhere. Instead of running on a dedicated server in Google Cloud, for example, the software would have no fixed physical address, moving between servers owned by independent data centers around the world. “Conceptually, it’s kind of running everywhere,” says Dfinity engineering manager Stanley Jones. DFINITY also features a native programming language, called ActorScript (name may be subject to change), and a virtual machine for smart contract creation and execution. The new smart contract language is intended to simplify the management of application state for programmers via an orthogonal persistence environment (which means active programs are not required to retrieve or save their state). All ActorScript contracts are eventually compiled down to WebAssembly instructions so the DFINITY virtual machine layer can execute the logic of applications running on the network. The advantage of using the WebAssembly standard is that all major browsers support it and a variety of programming languages can compile down to Wasm (not just ActorScript). Dfinity is moving fast. Recently, Dfinity showed off a TikTok clone called CanCan. In January it demoed a LinkedIn-alike called LinkedUp. Neither app is being made public, but they make a convincing case that apps made for the internet computer can rival the real things.
2.1 DFINITY CORE APPLICATIONS
The DFINITY cloud has two core applications:
Enabling the re-engineering of business: DFINITY ambitiously aims to facilitate the re-engineering of mass-market services (such as Web Search, Ridesharing Services, Messaging Services, Social Media, Supply Chain, etc) into open source businesses that leverage autonomous software and decentralised governance systems to operate and update themselves more efficiently.
Enable the re-engineering of enterprise IT systems to reduce costs: DFINITY seeks to re-engineer enterprise IT systems to take advantage of the unique properties that blockchain computer networks provide.
At present, computation on blockchain-based computer networks is far more expensive than traditional, centralised solutions (Amazon Web Services, Microsoft Azure, Google Cloud Platform, etc). Despite the higher computational cost, DFINITY intends to lower net costs "by 90% or more" by reducing the human capital cost associated with sustaining and supporting these services. Whilst conceptually similar to Ethereum, DFINITY employs original and new cryptography methods and protocols (crypto:3) at the network level, in concert with AI and network-fuelled systemic governance (the Blockchain Nervous System, or BNS) to facilitate corporate adoption. DFINITY recognises that different users value different properties and sees itself as more of a fully compatible extension of the Ethereum ecosystem than a competitor of the Ethereum network. In the future, DFINITY hopes that much of its "new crypto might be used within the Ethereum network and are also working hard on shared technology components." As the DFINITY project develops, the DFINITY Stiftung foundation intends to steadily increase the BNS's decision-making responsibilities, eventually dissolving its own involvement entirely once the BNS is sufficiently sophisticated. The DFINITY consensus mechanism is a heavily optimized proof of stake (PoS) model. It places a strong emphasis on transaction finality by implementing a Threshold Relay technique in conjunction with the BLS signature scheme and a notarization method to address many of the problems associated with PoS consensus.
2.2 THRESHOLD RELAY
As a public cloud computing resource, DFINITY targets business applications by substantially reducing cloud computing costs for IT systems. They aim to achieve this with a highly scalable and powerful network with potentially unlimited capacity. The DFINITY platform is chock-full of innovative designs and features, like its Blockchain Nervous System (BNS) for algorithmic governance. One of the primary components of the platform is its novel Threshold Relay consensus model, from which randomness is produced, driving the other systems that the network depends on to operate effectively. The consensus system was first designed for a permissioned participation model but can be paired with any method of Sybil resistance for an open participation model. The mechanism by which Dfinity randomly samples replicas into groups, sets the groups (committees) up for threshold operation, chooses the current committee, and relays from one committee to the next is called the Threshold Relay. Threshold Relay consists of four layers (as mentioned previously):
Notary layer, which provides fast finality guarantees to clients and external observers and eliminates nothing-at-stake and selfish mining attacks, providing Sybil attack resistance.
Blockchain layer that builds a blockchain from validated transactions via the Probabilistic Slot Protocol driven by the random beacon.
Random beacon layer, which, as previously covered, provides the source of randomness for all higher layers, such as the blockchain layer and smart contract applications.
Identity layer that provides a registry of all clients.
2.2.1 HOW DOES THRESHOLD RELAY WORK?
Threshold Relay produces an endogenous random beacon, and each new value defines a random group (or groups) of clients that may independently try to form into a "threshold group". The composition of each group is entirely random, such that groups can intersect and clients can be present in multiple groups. In DFINITY, each group comprises 400 members. When a group is defined, the members attempt to set up a BLS threshold signature system using a distributed key generation protocol. If they are successful within some fixed number of blocks, they register the public key ("identity") created for their group on the global blockchain using a special transaction, such that it will become part of the set of active groups in a following "epoch". The network begins at "genesis" with some number of predefined groups, one of which is nominated to create a signature on some default value. Such signatures are random values (if they were not, the group's signatures on messages would be predictable and the threshold signature system insecure), and each random value produced in this way is used to select a random successor group. This next group then signs the previous random value to produce a new random value and select another group, relaying between groups ad infinitum and producing a sequence of random values.

In a cryptographic threshold signature system, a group can produce a signature on a message upon the cooperation of some minimum threshold of its members, which is set to 51% in the DFINITY network. To produce the threshold signature, group members sign the message individually (here, the preceding group's threshold signature), creating individual "signature shares" that are then broadcast to other group members. The group threshold signature can be constructed upon combination of a sufficient threshold of signature shares.
For example, if the group size is 400 and the threshold is set at 201, any client that collects that many shares will be able to construct the group's signature on the message. Other group members can validate each signature share, and any client using the group's public key can validate the single group threshold signature produced by combining them. The magic of the BLS scheme is that it is "unique and deterministic", meaning that from whatever subset of group members the required number of signature shares is collected, the single threshold signature created is always the same and only a single correct value is possible. Consequently, the sequence of random values produced is entirely deterministic and unmanipulable, and the signatures generated by relaying between groups produce a verifiable random function, or VRF. Although the sequence of random values is pre-determined given some set of participating groups, each new random value can only be produced upon the minimal agreement of a threshold of the current group. Conversely, in order for relaying to stall because a random number was not produced, the number of correct processes must be below the threshold. Thresholds are configured so that this is extremely unlikely. For example, if the group size is set to 400 and the threshold is 201, 200 or more of the processes must become faulty to prevent production. If there are 10,000 processes in the network, of which 3,000 are faulty, the probability that this will occur is less than 10^-17.
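The closing probability claim can be checked directly. Selecting a 400-member group from 10,000 processes of which 3,000 are faulty is sampling without replacement, so the number of faulty members in a group follows a hypergeometric distribution; the group can stall the relay only if it contains at least 200 faulty members, leaving fewer than the 201 correct members needed to sign. A short exact computation in Python:

```python
from math import comb

def stall_probability(N, faulty, group_size, threshold):
    """Exact probability that a randomly sampled group contains enough
    faulty members to block the threshold signature, i.e. fewer than
    `threshold` correct members (a hypergeometric tail sum)."""
    needed_faulty = group_size - threshold + 1  # this many faulty members stall the group
    total = 0
    for k in range(needed_faulty, min(group_size, faulty) + 1):
        total += comb(faulty, k) * comb(N - faulty, group_size - k)
    return total / comb(N, group_size)

# The text's worked numbers: 10,000 processes, 3,000 faulty,
# groups of 400, threshold 201 -> 200+ faulty members needed to stall.
p = stall_probability(10_000, 3_000, 400, 201)
print(f"stall probability: {p:.2e}")  # a vanishingly small number
```

Sampling 400 of 10,000 draws a group whose faulty fraction concentrates tightly around 30%, so reaching 50% faulty within one group is an enormous deviation, which is why the tail is so small.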
2.3 DFINITY TOKEN
The DFINITY blockchain also supports a native token, called dfinities (DFN), which perform multiple roles within the network, including:
Fuel for deploying and running smart contracts.
Security deposits (i.e. staking) that enable participation in the BNS governance system.
Security deposits that allow client software or private DFINITY cloud networks to connect to the public network.
Although dfinities will end up being assigned a value by the market, the DFINITY team does not intend for DFN to act as a currency. Instead, the project has envisioned PHI, a “next-generation” crypto-fiat scheme, to act as a stable medium of exchange within the DFINITY ecosystem. Neuron operators can earn Dfinities by participating in network-wide votes, which could be concerning protocol upgrades, a new economic policy, etc. DFN rewards for participating in the governance system are proportional to the number of tokens staked inside a neuron.
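The stake-proportional reward rule described above can be illustrated in a few lines. The neuron names and amounts here are made up, and the real reward schedule is determined by the governance system.

```python
def distribute_rewards(neuron_stakes: dict, reward_pool: float) -> dict:
    """Split a voting-reward pool among neurons in proportion to the
    DFN staked in each (hypothetical numbers, illustrative only)."""
    total_stake = sum(neuron_stakes.values())
    return {n: reward_pool * s / total_stake for n, s in neuron_stakes.items()}

rewards = distribute_rewards({"n1": 600, "n2": 300, "n3": 100}, reward_pool=50.0)
print(rewards)  # {'n1': 30.0, 'n2': 15.0, 'n3': 5.0}
```

A neuron holding 60% of the staked tokens thus collects 60% of the pool for that voting period.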
DFINITY is being developed with a structure that separates consensus, validation, and storage into distinct layers. The storage layer is divided into multiple shards, each of which is responsible for processing the transactions that touch its shard's state. The validation layer is responsible for combining the hashes of all shards in a Merkle-like structure, producing a global state root that is stored in blocks on the top-level chain.
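The Merkle-like aggregation described above can be sketched in miniature. This is a toy illustration (real state certification is considerably more involved) that assumes each shard exposes a byte string summarizing its current state:

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves: list) -> bytes:
    """Hash each shard-state summary, then combine the hashes pairwise,
    level by level, until a single root remains."""
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:                  # duplicate the last node on odd levels
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

shard_states = [b"shard-0-state", b"shard-1-state", b"shard-2-state"]
root = merkle_root(shard_states)            # one hash certifying all shards
print(root.hex())
```

Changing any single shard's state changes the root, which is what lets a top-level block commit to the entire sharded state with one hash.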
2.5 DFINITY CONSENSUS ALGORITHM
The single most important aspect of the user experience is the time required before a transaction becomes final. This is not solved by a short block time alone; Dfinity's team also had to reduce the number of confirmations required to a small constant. DFINITY moreover had to provide a provably secure proof-of-stake algorithm that scales to millions of active participants without compromising on decentralization.

The team soon realized that the key to scalability lay in having an unmanipulable source of randomness available. Hence they built a scalable decentralized random beacon, based on what they call the Threshold Relay technique, right into the foundation of the protocol. This strong foundation drives a scalable and fast consensus layer: on top of the beacon runs a blockchain that uses notarization by threshold groups to achieve near-instant finality. Details can be found in the team's overview paper.

The roots of the DFINITY consensus mechanism date back to 2014, when its Chief Scientist, Dominic Williams, started to look for more efficient ways to drive large consensus networks. Since then, much research has gone into the protocol, and it took several iterations to reach its current design. For any practical consensus system, the difficulty lies in navigating the tight terrain between the boundaries imposed by theoretical impossibility results and practical performance limitations. The first key milestone was the novel Threshold Relay technique for decentralized, deterministic randomness, made possible by certain unique characteristics of the BLS signature scheme. The next breakthrough was the notarization technique, which allows DFINITY consensus to solve the traditional problems of proof-of-stake systems. Getting the security proofs sound was the final step before publication.
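The beacon's chained construction can be illustrated with a toy model. BLS threshold signing is replaced here by a deterministic stand-in (HMAC keyed by a hypothetical per-group secret), which preserves the one property the beacon relies on: each round's output is a deterministic yet unpredictable function of the previous output, so the sequence is fixed in advance but unmanipulable by any single party:

```python
import hashlib
import hmac

# Hypothetical per-group secrets; in DFINITY these would be BLS group keys.
GROUP_KEYS = [b"group-0-secret", b"group-1-secret", b"group-2-secret"]

def group_sign(group_key: bytes, message: bytes) -> bytes:
    # Stand-in for a unique, deterministic BLS threshold signature.
    return hmac.new(group_key, message, hashlib.sha256).digest()

def beacon_sequence(genesis: bytes, rounds: int) -> list:
    """Each round, the group selected by the previous output signs it;
    the hash of that signature becomes the next random value."""
    outputs, prev = [], genesis
    for _ in range(rounds):
        group = int.from_bytes(prev, "big") % len(GROUP_KEYS)  # prev picks group
        signature = group_sign(GROUP_KEYS[group], prev)
        prev = hashlib.sha256(signature).digest()
        outputs.append(prev)
    return outputs

values = beacon_sequence(hashlib.sha256(b"genesis").digest(), 5)
```

Re-running the sequence reproduces the same values, mirroring the "pre-determined yet unmanipulable" property described above; what the toy model omits is that no single real party holds a group key, since the signature only materializes once a threshold of members contribute shares.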
DFINITY consensus makes deliberate trade-offs between the practical side (realistic threat models and security assumptions) and the theoretical side (provable security). The result is a flexible, tunable algorithm that the team expects will establish itself as the best-performing proof-of-stake algorithm. In particular, the built-in random beacon will prove indispensable when building out sharding and scalable validation techniques.
The startup has rather cheekily called this "an open version of LinkedIn," the Microsoft-owned social network for professionals. Unlike LinkedIn, LinkedUp, which runs in any browser, is not owned or controlled by a corporate entity. LinkedUp is built on Dfinity's so-called Internet Computer, its name for the platform it is building to distribute the next generation of software and open internet services. The software is hosted directly on the internet in a Switzerland-based independent data center, but in the concept of the Internet Computer, it could be hosted at your house or mine. The compute power to run the application (LinkedUp, in this case) comes not from Amazon AWS, Google Cloud or Microsoft Azure, but from the distributed architecture that Dfinity is building. Specifically, Dfinity notes that when enterprises and developers run their web apps and enterprise systems on the Internet Computer, the content is decentralized across a minimum of four, and up to an unlimited number of, nodes in Dfinity's global network of independent data centers. Dfinity has open-sourced LinkedUp so that developers can create other types of open internet services on the architecture it has built. "Open Social Network for Professional Profiles" suggests that on Dfinity's model one could create an "Open WhatsApp", "Open eBay", "Open Salesforce" or "Open Facebook". The tools include a Canister Software Developer Kit and a simple programming language called Motoko that is optimized for Dfinity's Internet Computer. "The Internet Computer is conceived as an alternative to the $3.8 trillion legacy IT stack, and empowers the next generation of developers to build a new breed of tamper-proof enterprise software systems and open internet services. We are democratizing software development," Williams said.
"The Bronze release of the Internet Computer provides developers and enterprises a glimpse into the infinite possibilities of building on the Internet Computer, which also reflects the strength of the Dfinity team we have built so far." Dfinity says its Internet Computer Protocol allows for a new type of software called autonomous software, which can guarantee permanent APIs that cannot be revoked. When all these open internet services (e.g. open versions of WhatsApp, Facebook, eBay, Salesforce, etc.) are combined with other open software and services, they create "mutual network effects" from which everyone benefits. On 1 November, DFINITY released 13 new public versions of the SDK, leading to its second major milestone [at WEF Davos] of demoing a decentralized web app called LinkedUp on the Internet Computer. Subsequent milestones towards the public launch of the Internet Computer will involve:
On boarding a global network of independent data centers.
Fully tested economic system.
Fully tested Network Nervous Systems for configuration and upgrades.
2.7 WHAT IS MOTOKO?
Motoko is a new software language being developed by the DFINITY Foundation, with an accompanying SDK, that is designed to help the broadest possible audience of developers create reliable and maintainable websites, enterprise systems and internet services on the Internet Computer with ease. By developing the Motoko language, the DFINITY Foundation will ensure that a language that is highly optimized for the new environment is available. However, the Internet Computer can support any number of different software frameworks, and the DFINITY Foundation is also working on SDKs that support the Rust and C languages. Eventually, it is expected there will be many different SDKs that target the Internet Computer.
Before we even begin to understand what bitcoin mining difficulty means, we need to know how mining works. We have covered this topic in detail before, so we will just give a little overview before getting into the different nuances of difficulty. Following that, we will look at how mining difficulty is calculated and how it changes to suit the network's needs.

First, share difficulty. Let's say you are mining at 50 TH/s and the mining pool sets your share difficulty at 1,000,000. You get credited by the pool for all shares that are above 1,000,000. Network difficulty, by contrast, is set by the protocol itself and adjusts periodically: the Bitcoin network recently adjusted its difficulty level at 01:18 UTC on July 1 to 15.7842 trillion, down by a fraction of a percent, the smallest percentage change in 10 years.

There are a lot of chats out there on the subject of cryptocurrency mining and trading. A typical question from any newbie is something like, "which ASIC should I buy, or which mining rig should I build?", and the response is: "Take a look at the growing difficulty of the network and don't even think about getting into this business!"

In solo Bitcoin mining, the hardware works against a hash threshold derived from the difficulty setting. The mining hardware iterates through every possible value for the block header nonce and generates the corresponding hash. If none of the hashes are below the threshold, the mining hardware gets an updated block header with a new merkle root from the mining software; this new block header is created by adding extra nonce data to the coinbase field.
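The nonce-iteration loop described above can be sketched in simplified form. This omits real Bitcoin block-header serialization and uses an artificially easy target so the search terminates quickly, but the structure is the same: hash the header with each candidate nonce and accept the first hash below the target.

```python
import hashlib

def double_sha256(data: bytes) -> bytes:
    # Bitcoin hashes block headers with two rounds of SHA-256.
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def mine(header: bytes, target: int, max_nonce: int = 2**32):
    """Iterate nonces until the block hash falls below the target.
    A lower target means higher difficulty: fewer hashes qualify."""
    for nonce in range(max_nonce):
        digest = double_sha256(header + nonce.to_bytes(4, "little"))
        if int.from_bytes(digest, "big") < target:
            return nonce        # found a valid proof of work
    return None                 # nonce space exhausted: refresh the header
                                # (in practice, via extra nonce data)

easy_target = 1 << 240          # artificially easy, for demonstration only
nonce = mine(b"example-block-header", easy_target)
```

With this toy target, roughly 1 hash in 2^16 qualifies, so the loop finishes in a moment; at real network difficulty the expected number of attempts is astronomically larger, which is why dedicated hardware is used.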