
Infura

For all things Infura visit us on our new community site at https://community.infura.io

district0x

District0x is a network of decentralized markets and communities (districts). With district0x, anyone can create, operate, and govern networks of decentralized markets and communities. The district0x network is powered by Ethereum, Aragon, and IPFS.

Large is an open source peer-to-peer blogging platform that helps you build apps and websites that are hosted by the people that use them. Built on IPFS, OrbitDB, and Ethereum.

For the past year I've been working on a toolkit to make it easy to combine these different technologies to build applications that are hosted with IPFS. A few months ago I discovered OrbitDB which has really connected a lot of the dots.
Large is built on top of IPFS and Ethereum.
Right now I'm working on a Twitter-like prototype. Most of the basic functionality works but it's super rough around the edges. There are a whole lot of bugs still, but I've made a whole lot of progress over the past couple of months.
Over the next couple of weeks I'm going to be building out some code-along tutorials to show people how to use this to build real apps. All of the data is hosted on the machine where the app is running. Orbit allows you to load remote databases and send data to any peers you decide to connect to. In order to connect to a peer you just need their wallet public key.
I've been trying to separate the data services out enough so that you're not actually tied to the things that I prefer to use for the front-end.
https://gitlab.com/ptonelarge <- The prototype Twitter clone
https://gitlab.com/ptonelarge-core <- P2P data services
Documentation is still rough and some things are probably old and inaccurate. I'm writing + fixing it as fast as I can.
I love IPFS and want to make it easy to build stuff with it. I appreciate any feedback.
submitted by patricktoner to ipfs

Breaking The Ice: A Crash Course In IPFS, Ethereum And Fat Protocols Of The Future - ELIX Blog

submitted by elixirdev2 to BlogDiscovery

[blog post] Results of the IPFS Ethereum Hackathon hosted by MetaMask!

submitted by danfinlay to ethereum

RESEARCH REPORT ABOUT THE GRAPH NETWORK

Author: Gamals Ahmed, CoinEx Business Ambassador

ABSTRACT

The Graph is a protocol for organizing blockchain data and making it easily accessible. It’s powering many of the most used applications in DeFi and the broader Web3 ecosystem today. Anyone can build and publish subgraphs, which are open APIs that applications can query with GraphQL. Subgraphs make it easy for developers to build on blockchains. What Google does for search, The Graph does for blockchains.
Currently, The Graph’s hosted service is processing over 4 billion monthly queries for applications like Uniswap, CoinGecko, and Synthetix, for data like token prices, past trade volumes, and liquidity. However, The Graph’s mission is not to run a hosted service in perpetuity but to eliminate the possibility of APIs, servers, and databases becoming single points of failure and control. This is why they are building The Graph Network: an open marketplace of Indexers and Curators that work together to efficiently index and serve all the data for DeFi and Web3 in a decentralized way.

1. INTRODUCTION

Anyone who has ever tried to build decentralized applications (dApps) on the Ethereum blockchain would concur: although blockchains are conceptually quite close to databases, querying blockchains feels like a different world entirely compared to querying databases.
First off, there are notable performance issues with storing data on blockchains. These have a lot to do with the distributed nature of blockchains, and the penalty imposed by the combination of consensus protocols and cryptography.
Databases would be slow, too, if they consisted of a network of nodes in which every node kept a full copy of the entire database and every transaction had to be verified by every node. This is why people have been experimenting with various approaches to using blockchains as a database, including altering blockchain structure.
The Graph does something different: it lets blockchains be, but offers a way to index and query data stored on them efficiently using GraphQL.
QUERYING BLOCKCHAINS
Actually, performance is only part of the issue with retrieving data from blockchains. It gets worse: Blockchains have no query language to speak of. Imagine a database with no query language! How would you ever get what you need out of it? How do people build dApps, really? With a lot of effort, and brittle, ad-hoc code.
Blockchain data access is challenging mainly due to three fundamental reasons: Decentralization, Opacity, and Sequential Data Storage. So people are left with a few choices:
Writing custom code to locate the data they need on blockchains, and then either repeating those (expensive) calls every time the data is needed, or retrieving the data once, storing it in an off-chain database, and building an index that points back to the original blockchain data.
Why querying data on blockchains is hard. Image: Jesus Rodriguez
This is where The Graph comes in. The Graph is a decentralized protocol for indexing and querying blockchain data. But it’s more than just a protocol: The Graph also has an implementation, which is open source and uses GraphQL.
GraphQL is a query language for APIs, developed and open-sourced by Facebook. GraphQL has taken on a life of its own; it is gaining in popularity and being used to access databases, too (see Prisma or FaunaDB, for example).
ZDNet had a Q&A with The Graph’s co-founders, project lead Yaniv Tal and research lead Brandon Ramirez.
In Tal’s words, right now, teams working on dApps have to write a ton of custom code and deploy proprietary indexing servers in order to efficiently serve applications. Because all of this code is custom there’s no way to verify that indexing was done correctly or outsource this computation to public infrastructure.
By defining a standardized way of doing this indexing and serving queries deterministically, Tal went on to add, developers will be able to run their indexing logic on public open infrastructure where security can be enforced.
The Graph has open-sourced all of its main components, including Graph Node (an implementation of an indexing node, built in Rust), Graph TS (AssemblyScript helpers for building mappings), and Graph CLI (command-line tools for speeding up development).

1.1 OVERVIEW ABOUT THE GRAPH NETWORK

The Graph is a decentralized protocol for indexing and querying data from blockchains, starting with Ethereum. It makes it possible to query data that is difficult to query directly.
The Graph is a protocol for building decentralized applications (dApps) quickly on Ethereum and IPFS using GraphQL. The idea behind The Graph is to provide a way to query a blockchain in a simple yet fast manner.
The Graph includes a Graph Node, which is an application that processes the entire blockchain and allows subgraphs to be registered on it. These subgraphs define what contracts to listen to and how to process the data when events are triggered on the contracts.
The Graph Network decentralizes the query and API layer of Web3, removing a tradeoff dApp developers struggle with today: whether to build an application that is performant or to build an app that is truly decentralized.
Today, developers can run a Graph Node on their own infrastructure, or they can build on their hosted service. Developers build and deploy subgraphs, which describe how to ingest and index data from Web3 data sources. Many leading Ethereum projects have already built subgraphs including: Uniswap, ENS, DAOstack, Synthetix, Moloch, and more. In The Graph Network, any Indexer will be able to stake Graph Tokens (GRT) to participate in the network and earn fees as well as inflation rewards for serving queries.
Consumers will be able to use this growing set of Indexers by paying for their metered usage, providing a model where the laws of supply and demand sustain the services provided by the protocol.
Today, it can be easy to retrieve some information from a blockchain like an account’s balance or the status of a specific transaction. However, things become more complicated when we want to query specific information, such as a transaction list for an account of a particular contract. Sometimes the data persisted in a contract cannot be used directly for specific purposes, and transformations need to be done. Here is where The Graph and its subgraphs become really helpful.
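To make the querying model concrete, here is a minimal sketch of what an application-side GraphQL request to a subgraph looks like. The entity and field names (`swaps`, `amountUSD`) are hypothetical examples, not a real subgraph schema, and the response is mocked rather than fetched over the network:

```python
import json

# Hypothetical query against an example subgraph; a real deployment
# defines its own entities in its GraphQL schema.
QUERY = """
{
  swaps(first: 3, orderBy: timestamp, orderDirection: desc) {
    id
    amountUSD
    timestamp
  }
}
"""

def build_payload(query, variables=None):
    """Serialize a GraphQL request body, as sent via HTTP POST."""
    return json.dumps({"query": query, "variables": variables or {}})

def extract_entities(response_body, entity):
    """Pull the entity list out of a GraphQL response envelope."""
    return json.loads(response_body)["data"][entity]

# Example with a mocked response (no network call):
payload = build_payload(QUERY)
mock_response = json.dumps(
    {"data": {"swaps": [{"id": "0xabc", "amountUSD": "1250.5",
                         "timestamp": "1600000000"}]}}
)
swaps = extract_entities(mock_response, "swaps")
```

The point of the sketch is the shape of the exchange: the application asks for exactly the fields it needs, and the transformation work (indexing transfers into balances, trades into volumes) has already been done by the subgraph.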
The Graph Network is core infrastructure for Web3 — a necessary component for delivering decentralized applications with consumer-grade performance.
The Graph network will allow apps to be serverless — making them truly unstoppable since they’ll no longer rely on a single server or database but rather a network of nodes that are incentivized to keep the service running. The Graph Network also lets diverse, active participants earn income for providing data services rather than giving that power to data monopolies.
The Graph is transforming the existing data economy to one with better incentives, safer data sources, curated APIs and more expressive querying. The Graph Network will be launching later this year.
Quick Take:
  • The Graph, a San Francisco-based startup, has developed an indexing protocol that organizes all the information on the blockchain in an efficient way.
  • Many Ethereum applications are using the protocol to improve user experience.
  • The firm plans to use its latest funding to eliminate single points of failure.

1.1.1 FULL-STACK DECENTRALIZATION

The mission of The Graph is to enable internet applications that are entirely powered by public infrastructure.
Full-stack decentralization will enable applications that are resistant to business failures and rent seeking and also facilitate an unprecedented level of interoperability. Users and developers will be able to know that software they invest time and money into can’t suddenly disappear.
Today, most “decentralized” applications only adopt such a model in the bottom layer of the stack — the blockchain — where users pay for transactions that modify application state. The rest of the stack continues to be operated by centralized businesses and is subject to arbitrary failures and rent seeking.

1.1.2 THE GRAPH NETWORK ORIGINS

The cofounders, Yaniv Tal, Jannis Pohlmann, and Brandon Ramirez, spent considerable time thinking about how to build software faster. They built frameworks, developer tools, and infrastructure to make application development more productive.
When they started diving into Ethereum in early 2017, it was apparent that the immature tooling and lack of established protocols made it difficult to build dApps. The idea of making open data more accessible became an obsession of theirs, and The Graph was born.
They built the first prototype in late 2017. They spent months iterating on the design over whiteboard sessions, prototyping, and conversations with developers. They wanted to find a productive developer experience for writing indexing logic that could be securely operated on a decentralized network.

1.1.3 THE GRAPH, AN OPEN SOURCE PROTOCOL AND IMPLEMENTATION

As per Tal, the core of what The Graph has done is to define a deterministic way of doing indexing. Graph Node defines a store abstraction that they implement using Postgres:
“Everything you need to run a subgraph is open source. Right now, we use Postgres under the hood as the storage engine. Graph Node defines a store abstraction that we implement using Postgres, and we reserve the right to change the underlying DB in the future. We’ve written a lot of code, but it’s all open source, so none of this is proprietary,” Tal said.
The subgraph that Tal refers to here is a defined slice of blockchain data, indexed for specific dApps. Defining a subgraph is the first step in using The Graph. Subgraphs for popular protocols and dApps are in use already, and can be browsed using the Graph Explorer, which provides a user interface to execute GraphQL queries against specific smart contracts or dApps.
When The Graph was introduced in July 2018, Tal mentioned they would launch a local node, a hosted service, and then a fully decentralized network. The hybrid network is a version of the protocol design that bridges the gap between the hosted service, which is mostly centralized, and the fully decentralized protocol.
Users can run their own instance of The Graph, or they can use the hosted service. This inevitably leads to the question about the business model employed by The Graph, as running a hosted service costs money.

1.1.4 HOW THE GRAPH WORKS

The Graph learns what and how to index Ethereum data based on subgraph descriptions, known as the subgraph manifest. The subgraph description defines the smart contracts of interest for a subgraph, the events in those contracts to pay attention to, and how to map event data to data that The Graph will store in its database.
Once you have written a subgraph manifest, you use the Graph CLI to store the definition in IPFS and tell the hosted service to start indexing data for that subgraph.
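The manifest fields described above can be sketched as follows. Real manifests are YAML files (subgraph.yaml); a Python dict is used here purely for illustration, and the contract address and `Transfer` event are hypothetical placeholders, not taken from any real subgraph:

```python
# Illustrative sketch of a subgraph manifest's shape; field names follow
# the description above, values are placeholders.
manifest = {
    "specVersion": "0.0.2",
    "schema": {"file": "./schema.graphql"},   # entity definitions (GraphQL SDL)
    "dataSources": [
        {
            "kind": "ethereum/contract",
            "name": "ExampleToken",
            "source": {
                "address": "0x0000000000000000000000000000000000000000",  # placeholder
                "abi": "ExampleToken",
            },
            "mapping": {
                # which contract events to watch, and which handler maps
                # each event's data into stored entities
                "eventHandlers": [
                    {"event": "Transfer(address,address,uint256)",
                     "handler": "handleTransfer"},
                ],
                "file": "./src/mapping.ts",
            },
        }
    ],
}

def watched_events(m):
    """List every contract event the manifest asks Graph Node to index."""
    return [h["event"]
            for ds in m["dataSources"]
            for h in ds["mapping"]["eventHandlers"]]
```

The three pieces named here (the smart contracts of interest, the events to watch, and the mapping from event data to stored entities) are exactly the three things the paragraph above says a subgraph description defines.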
This diagram gives more detail about the flow of data once a subgraph manifest has been deployed, dealing with Ethereum transactions:
https://preview.redd.it/fdmtke7uxzw51.jpg?width=923&format=pjpg&auto=webp&s=42d6eafe8edcd8e4521a6bd2921a124dd8514398
The Graph data flow (Image credit)
The flow follows these steps:
  1. A decentralized application adds data to Ethereum through a transaction on a smart contract.
  2. The smart contract emits one or more events while processing the transaction.
  3. Graph Node continually scans Ethereum for new blocks and the data they may contain for your subgraph.
  4. Graph Node finds Ethereum events for your subgraph in these blocks and runs the mapping handlers you provided. The mapping is a WASM module that creates or updates the data entities that Graph Node stores in response to Ethereum events.
  5. The decentralized application queries the Graph Node for data indexed from the blockchain, using the node’s GraphQL endpoint. The Graph Node in turn translates the GraphQL queries into queries for its underlying data store in order to fetch this data, making use of the store’s indexing capabilities.
  6. The decentralized application displays this data in a rich UI for end-users, which they use to issue new transactions on Ethereum.
  7. The cycle repeats.
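Steps 3 and 4 above can be sketched as a simple scan-and-dispatch loop. The block and event structures here are in-memory stand-ins; a real Graph Node scans Ethereum blocks, runs WASM mapping modules, and persists entities in its store:

```python
# Simplified sketch of the indexing loop: scan blocks, dispatch each
# event to its mapping handler, which updates the stored entities.
def run_mappings(blocks, handlers, store):
    for block in blocks:
        for event in block["events"]:
            handler = handlers.get(event["name"])
            if handler:
                handler(event, store)  # handler creates/updates entities

def handle_transfer(event, store):
    """Example mapping: keep a running balance per address."""
    frm, to, value = event["args"]
    store[frm] = store.get(frm, 0) - value
    store[to] = store.get(to, 0) + value

blocks = [
    {"number": 1, "events": [{"name": "Transfer", "args": ("alice", "bob", 10)}]},
    {"number": 2, "events": [{"name": "Transfer", "args": ("bob", "carol", 4)}]},
]
store = {}
run_mappings(blocks, {"Transfer": handle_transfer}, store)
# store now holds the derived balances a dApp would query via GraphQL
```

The key property is that the handler, not the querying dApp, does the transformation work once, at indexing time, rather than on every query.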

1.1.5 FUNDING

The Graph has raised a total of $7.5M in funding over 4 rounds. Their latest funding was raised on Jun 30, 2020 from an undisclosed round. The Graph is funded by 12 investors; AU21 Capital and Digital Currency Group are the most recent.

2. THE GRAPH NETWORK ARCHITECTURE

The Graph Network includes smart contracts that run on Ethereum combined with a variety of additional services and clients that operate off-chain.

2.1 QUERY MARKET

The query market serves a similar purpose to an API in a traditional cloud-based application — efficiently serving data required by a front end running on a user’s device. The key difference is that whereas a traditional API is operated by a single economic entity that users have no say over, the query market comprises a decentralized network of Indexers, all competing to provide the best service at the best price.
The typical flow of interacting with the query market is as follows:
  • Service Discovery. The consumer asks The Graph which Indexers have the data they are interested in.
  • Indexer Selection. The consumer selects an Indexer to transact with based on which they deem most likely to provide the highest quality service at the best price.
  • Query + Conditional Micropayment. The consumer sends the Indexer a query along with a conditional micropayment that specifies how much they are willing to pay for compute and bandwidth.
  • Response + Attestation. If the Indexer accepts the price offered by the consumer, then they process the query and respond with the resulting data, as well as an attestation that this response is correct. Providing this attestation unlocks the conditional micropayment.
  • The attestation is produced deterministically and is uniquely attributable to the Indexer for the purposes of verification and dispute resolution elsewhere in the protocol.
  • A single decentralized application querying The Graph may use multiple subgraphs indexed by different Indexers and in that case would go through the above flow for each subgraph being queried.
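The query/attestation exchange above can be sketched as follows. This is a deliberate simplification: real attestations are signed messages uniquely attributable to the Indexer, and payments flow through state channels; here a keyed hash stands in for the signature and a plain number stands in for the micropayment:

```python
import hashlib

def attest(indexer_key, query, response):
    """Deterministic receipt binding this indexer to this response
    (stand-in for a real cryptographic signature)."""
    return hashlib.sha256(f"{indexer_key}|{query}|{response}".encode()).hexdigest()

def settle(micropayment, attestation, indexer_key, query, response):
    """Release the conditional micropayment only if the attestation matches."""
    if attest(indexer_key, query, response) == attestation:
        return micropayment  # payment unlocked
    return 0                 # mismatch: payment stays locked, dispute possible

query = "{ swaps(first: 1) { id } }"
response = '{"data": {"swaps": [{"id": "0xabc"}]}}'
a = attest("indexer-1", query, response)
paid = settle(100, a, "indexer-1", query, response)
```

The property the sketch illustrates is the one the text relies on: because the attestation is deterministic and attributable, providing it both unlocks payment and creates evidence that can be checked later in dispute resolution.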

2.2 PROTOCOL ROLES

These are the roles that interact with the system, the behaviors they must engage in for the protocol to function correctly, and the incentives that motivate them:
  • Consumers. Consumers pay Indexers for queries. These will typically be end users but could also be web services or middleware that integrate with The Graph.
  • Indexers. Indexers are the node operators of The Graph. They are motivated by earning financial rewards.
  • Curators. Curators use GRT to signal what subgraphs are valuable to index. These will typically be developers but they could also be end users supporting a service they rely upon or a persona that is purely financially motivated.
  • Delegators. Delegators put GRT at stake on behalf of an Indexer in order to earn a portion of inflation rewards and fees, without having to personally run a Graph Node. They are financially motivated.
  • Fishermen. Fishermen secure the network by checking if query responses are accurate. Fishermen are altruistically motivated, and for that reason, The Graph will initially operate a fisherman service for the network.
  • Arbitrators. Arbitrators determine whether Indexers should be slashed or not during dispute resolution. They may be financially or altruistically motivated.

2.3 USES OF THE GRAPH PROTOCOL

1. For Developers
For developers, the APIs for building a subgraph will remain largely the same as they are when using a local or hosted Graph Node.
One notable difference is in how developers deploy subgraphs. Rather than deploying to a local or hosted Graph Node, they will deploy their subgraph to a registry hosted on Ethereum and deposit a stake of GRT to curate that subgraph. This serves as a signal to Indexers that this subgraph should be indexed.
2. For End Users
For end users, the major difference is that rather than interacting with centralized APIs that are subsidized, they will need to begin paying to query a decentralized network of Indexers. This will be done via a query engine running on their machine — either in the browser, as an extension, or embedded in the dApp.
The query engine allows the user to safely query the vast amounts of data stored on The Graph without having to personally do the work to compute and store that data. The query engine also acts as a trading engine, making decisions such as which Indexers to do business with or how much to pay, based on the dApp being used or the user’s preferences.
For the query engine to provide a good user experience, it will need to automatically sign micropayment transactions on behalf of users rather than prompting them for every transaction that needs signing. We’re working with several state channel teams building on Ethereum to make sure that the wallets and functionality they ship meets the needs of metered usage protocols like The Graph. In the meantime, we will host a gateway that allows dApps to subsidize queries on behalf of users.
3. For Indexers
Indexers will be able to join The Graph by staking GRT and running a version of Graph Node.
They will also want to run an indexer agent that programmatically monitors their resource usage, sets prices, and decides which subgraphs to index. The indexer agent will be pluggable, and we expect that node operators will experiment with their own pricing models and strategies to gain a competitive edge in the marketplace over other Indexers.
4. For Curators and Delegators
Curators and delegators will curate and delegate via Graph Explorer. When we launch the network, Graph Explorer will be a fully decentralized application, and using it will require a dApp-enabled browser with an Ethereum wallet.
5. Using GraphQL with dApps
Now, GraphQL is popular, and it certainly beats having no query language at all. But there are also some popular misconceptions around it, and it’s good to be aware of them when considering The Graph, too. A significant part of GraphQL, added relatively recently, is its SDL (Schema Definition Language). This may enable tools to center the development process around a GraphQL schema.
Developers may create their domain model in SDL, and then use it not just to validate the JSON returned by GraphQL, but also to generate code, in MDD (Model Driven Development) fashion. In any case, using GraphQL does not “magically” remove the complexity of mapping across many APIs. It simply abstracts and transposes it to the GraphQL resolver.
So unless there is some kind of mapping automation/maintenance mechanism there, the team that uses the APIs abstracted via GraphQL may have a better experience, but this is at the expense of the team that maintains the API mappings. There’s no such thing as a free lunch, and the same applies for blockchains.
Even more so, in fact, as smart contracts cannot at this point be driven by GraphQL Schema. You first need to create a smart contract, then the GraphQL Schema and resolver for it. This makes for a brittle and tiresome round-trip to update schema and resolver each time the smart contract changes. Ramirez acknowledged this, and elaborated on the process of accessing smart contract data via GraphQL:
“The GraphQL schema is used to express a data model for the entities, which will be indexed as part of a subgraph. This is a read-schema, and is only exposed at layer two, not in the smart contracts themselves. Ethereum doesn’t have the semantics to express rich data models with entities and relationships, which is one reason that projects find querying Ethereum via The Graph particularly useful.
If a smart contract ABI changed in breaking ways, then this could require mappings to be updated if they were relying on those parts of the interface, but this isn’t a Graph-specific problem, as any application or service fetching data directly from that smart contract would have similar problems. Generally, making breaking changes to an API with real usage is a bad idea, and is very unlikely to happen in the smart contract world once shipped to production and widely used (it defeats the purpose). Part of the “magic” of The Graph is that they auto-generate a “read schema” and resolvers based on your data model. No need to maintain anything but the data model schema and the mappings, which shouldn’t need to change often. We’re also adding support for custom resolvers, however, for more advanced users.”

2.4 GRAPH TOKENS

To support the functioning of the query market, the protocol introduces a native token: Graph Tokens (GRT).
Graph Tokens have two primary uses in the protocol:
  • Indexer Staking. Indexers deposit Graph Tokens to be discoverable in the query market and to provide economic security for the work they are performing.
  • Curator Signaling. Curators deposit Graph Tokens in a curation market, where they are rewarded for correctly predicting which subgraphs will be valuable to the network.
Consumers will be able to pay for queries in ETH or DAI. Payments will be settled, however, in GRT to ensure a common unit of account across the protocol.
According to Ramirez, The Graph’s business (token) model is the work token model, which will kick off when they launch the hybrid network. Indexing Nodes, which have staked to index a particular dataset, will be discoverable in the data retrieval market for that dataset. Payment in tokens will be required to use various functions of the service.
The hosted service, Ramirez went on to add, ingests blocks from Ethereum, watches for “triggers,” and runs WASM mappings, which update the Postgres store. There are currently no correctness guarantees in the hosted service, as you must trust The Graph as a trusted party.
In the hybrid network there will be economic security guarantees that data is correct, and in the fully decentralized network, there will be cryptographic guarantees as well. The goal would be to transition everyone on the hosted service to the hybrid network once it launches, although Ramirez said they wouldn’t do this in a way that would disrupt existing users.

2.4.1 INDEXER STAKING

The Graph adopts a work token model, where Indexers must stake Graph Tokens in order to sell their services in the query market. This serves two primary functions.
  • It provides economic security, as the staked GRT can be slashed if Indexers perform their work maliciously. Once GRT is staked, it may only be withdrawn subject to a thawing period, which provides ample opportunity for verification and dispute resolution.
  • It provides a Sybil resistance mechanism. Having fake or low quality Indexers on a given subgraph makes it slower to find quality service providers. For this reason we only want Indexers who have skin in the game to be discoverable.
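The staking lifecycle described above (stake is slashable while active, and withdrawals are delayed by a thawing period) can be sketched as a small state machine. The 28-epoch thawing period is an illustrative number, not a protocol parameter:

```python
# Minimal sketch of indexer staking: active stake secures work and is
# slashable; withdrawals pass through a thawing queue before release.
THAWING_EPOCHS = 28  # hypothetical thawing length, for illustration only

class IndexerStake:
    def __init__(self):
        self.active = 0          # stake securing current work
        self.thawing = []        # (amount, epoch when withdrawable)

    def stake(self, amount):
        self.active += amount

    def begin_withdraw(self, amount, current_epoch):
        assert amount <= self.active
        self.active -= amount
        self.thawing.append((amount, current_epoch + THAWING_EPOCHS))

    def withdraw(self, current_epoch):
        """Release only tranches whose thawing period has elapsed."""
        ready = sum(a for a, e in self.thawing if e <= current_epoch)
        self.thawing = [(a, e) for a, e in self.thawing if e > current_epoch]
        return ready

    def slash(self, fraction):
        """Disputes can burn a fraction of active stake."""
        penalty = self.active * fraction
        self.active -= penalty
        return penalty

s = IndexerStake()
s.stake(1000)
s.begin_withdraw(400, current_epoch=0)
early = s.withdraw(current_epoch=10)   # too early: nothing released
later = s.withdraw(current_epoch=28)   # thawed: 400 released
```

The thawing queue is what gives the "ample opportunity for verification and dispute resolution" mentioned above: stake cannot exit faster than disputes can be raised against it.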
In order for the above mechanisms to function correctly, it’s important that Indexers are incentivized to hold GRT roughly in proportion to the amount of useful work they’re doing in the network.
A naive approach would be to try to make it so that each GRT staked entitles an Indexer to perform a specified amount of work on the network. There are two problems with this: first, it sets an arbitrary upper bound on the amount of work the network can perform; and second, it is nearly impossible to enforce in a way that is scalable, since it would require that all work be centrally coordinated on-chain.
A better approach has been pioneered by the team at 0x, and it involves collecting a protocol fee on all transactions in the protocol, and then rebating those fees to participants as a function of their proportional stake and proportional fees collected for the network, using the Cobb-Douglas production function.

2.4.2 CURATOR SIGNALING

For a consumer to query a subgraph, the subgraph must first be indexed — a process which can take hours or even days. If Indexers had to blindly guess which subgraphs they should index on the off-chance that they would earn query fees, the market would not be very efficient.
Curator signaling is the process of depositing GRT into a bonding curve for a subgraph to indicate to Indexers that the subgraph should be indexed.
Indexers can trust the signal because when curators deposit GRT into the bonding curve, they mint curation signal for the respective subgraph, entitling them to a portion of future query fees collected on that subgraph. A rationally self-interested curator should signal GRT toward subgraphs that they predict will generate fees for the network.
https://preview.redd.it/bnukd4fyxzw51.jpg?width=1200&format=pjpg&auto=webp&s=65a1188b037c7f09482ada00402fde1c31e93ab3
Using bonding curves, a type of algorithmic market maker where price is determined by a function, means that the more curation signal is minted, the higher the exchange rate between GRT and curation signal becomes. Thus, successful curators can take profits immediately if they feel that the value of future curation fees has been correctly priced in. Similarly, they should withdraw their GRT if they feel that the market has priced the value of curation signal too high.
This dynamic means that the amount of GRT signaled toward a subgraph should provide an ongoing and valuable market signal as to the market’s prediction for future query volume on a subgraph.
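The bonding-curve dynamic above can be made concrete with a sketch. A linear price curve (price = slope × signal supply) is chosen purely for illustration; the curve shape actually used by the protocol may differ:

```python
# Sketch of curation signaling on a linear bonding curve. Minting costs
# the area under the price curve, so later curators pay a higher rate.
SLOPE = 0.01  # illustrative parameter, not a protocol value

class SubgraphCurve:
    def __init__(self):
        self.signal = 0.0   # curation signal minted so far
        self.reserve = 0.0  # GRT deposited

    def price(self):
        """Marginal GRT cost of the next unit of signal."""
        return SLOPE * self.signal

    def mint(self, signal_amount):
        """Deposit GRT, mint signal; cost = area under the curve."""
        s0, s1 = self.signal, self.signal + signal_amount
        cost = SLOPE * (s1**2 - s0**2) / 2
        self.signal, self.reserve = s1, self.reserve + cost
        return cost

    def burn(self, signal_amount):
        """Withdraw GRT by burning signal at the current curve position."""
        s0, s1 = self.signal, self.signal - signal_amount
        refund = SLOPE * (s0**2 - s1**2) / 2
        self.signal, self.reserve = s1, self.reserve - refund
        return refund

c = SubgraphCurve()
early_cost = c.mint(100)   # early curator pays 0.01 * 100^2 / 2 = 50 GRT
late_cost = c.mint(100)    # the same signal now costs 150 GRT: price rose
```

This is exactly the incentive the text describes: early, correct predictions buy signal cheaply, and a curator who believes the signal is overpriced can burn at the now-higher rate.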

2.5 INDEXER INFLATION REWARD

Another mechanism they employ related to indexer staking and curator signaling is the indexer inflation reward.
This reward is intended to incentivize Indexers to index subgraphs that don’t yet have significant query volume. This helps to solve the bootstrapping problem for new subgraphs, which may not have pre-existing demand to attract Indexers.
The way it works is that each subgraph in the network is allotted a portion of the total network inflation reward, based on the proportional amount of total curation signal that subgraph has. That amount, in turn, is divided between all the Indexers staked on that subgraph proportional to their amount of contributed stake.
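The two-level split just described can be sketched directly: network inflation is divided across subgraphs by share of curation signal, then within each subgraph across Indexers by share of allocated stake. The subgraph names and numbers below are hypothetical:

```python
# Sketch of the inflation reward allocation: by signal across subgraphs,
# then by stake across the Indexers on each subgraph.
def inflation_rewards(total_reward, signal_by_subgraph, stake_by_indexer):
    """stake_by_indexer: {subgraph: {indexer: stake}}
    Returns {(subgraph, indexer): reward}."""
    total_signal = sum(signal_by_subgraph.values())
    rewards = {}
    for subgraph, signal in signal_by_subgraph.items():
        subgraph_reward = total_reward * signal / total_signal
        stakes = stake_by_indexer[subgraph]
        total_stake = sum(stakes.values())
        for indexer, stake in stakes.items():
            rewards[(subgraph, indexer)] = subgraph_reward * stake / total_stake
    return rewards

r = inflation_rewards(
    total_reward=1000,
    signal_by_subgraph={"subgraph-a": 300, "subgraph-b": 100},
    stake_by_indexer={"subgraph-a": {"i1": 50, "i2": 150},
                      "subgraph-b": {"i1": 100}},
)
# subgraph-a receives 750 (i1: 187.5, i2: 562.5); subgraph-b receives 250
```

Because the reward depends on signal rather than query volume, an Indexer can earn on a newly curated subgraph before it has any queries, which is the bootstrapping effect described above.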

2.6 GRAPH EXPLORER AND GRAPH NAME SERVICE

Curating subgraphs for Indexers is only half of the story when it comes to surfacing valuable subgraphs. They also want to surface valuable subgraphs for developers.
This is one of the core value propositions of The Graph — to help developers find useful data to build on and make it effortless to incorporate data from a variety of underlying protocols and decentralized data sources into a single application.
Currently, developers accomplish this by navigating to Graph Explorer:
In The Graph Network, Graph Explorer will be a dApp, built on top of a subgraph that indexes the Graph Protocol smart contracts (meta, I know!) — including the Graph Name Service (GNS), an on-chain registry of subgraphs.
A subgraph is defined by a subgraph manifest, which is immutable and stored on IPFS. The immutability is important for having deterministic and reproducible queries for verification and dispute resolution. The GNS performs a much needed role by allowing teams to attach a name to a subgraph, which can then be used to point to consecutive immutable subgraph “versions.”
These human readable names, along with other metadata stored in the GNS, allows users of Graph Explorer to get a better sense for the purpose and possible utility of a subgraph in a way that a random string of alphanumeric characters and compiled WASM byte code does not.
In The Graph Network, discovering useful subgraphs will be even more important, as they will be shipping subgraph composition. Rather than simply letting dApps build on multiple separate subgraphs, subgraph composition will allow brand new subgraphs to be built that directly reference entities from existing subgraphs.
This reuse of the same subgraphs across many dApps and other subgraphs is one of the core efficiencies that The Graph unlocks. Compare this approach to the current state of the world where each new application deploys their own database and API servers, which often go underutilized.

2.7 INCENTIVES IN THE GRAPH NETWORK

GRT that is staked in the protocol is subject to a thawing period and can be slashed if Indexers are malicious and serve incorrect data to applications or if they index incorrectly. Curators and Delegators cannot be slashed for bad behavior, but there is a withdrawal tax on Curators and Delegators to disincentivize poor decision making that could harm the integrity of the network. Curators also earn fewer query fees if they choose to curate on a low-quality subgraph, since there will be fewer queries to process or fewer Indexers to process those queries.

2.7.1 QUERY MARKETPLACE

Indexers that stake GRT operate in a query marketplace where they earn query fees for indexing services and serving queries to subgraphs — like serving Uniswap trade data on Uniswap.info. The price of these queries will be set by Indexers and vary based on the cost to index the subgraph, the demand for queries, the amount of curation signal, and the market rate for blockchain queries. Since Consumers (i.e., applications) are paying for queries, the aggregate cost is expected to be much lower than the cost of running a server and database.
A Gateway can be used to allow consumers to connect to the network and to facilitate payments. The team behind The Graph will initially run a set of gateways that allows applications to cover the query costs on behalf of their users. These gateways facilitate connecting to The Graph Network. Anyone will be able to run their own gateways as well. Gateways handle state channel logistics for query fees, and route to Indexers as a function of price, performance and security that is predetermined by the application paying for those queries.

2.7.2 INDEXING REWARDS

In addition to query fees, Indexers and Delegators will earn indexing rewards: newly issued GRT distributed in proportion to curation signal and allocated stake. Indexing rewards will start at 3% annually. Future GRT monetary policy will be set by an independent technical governance body, which will be established as we approach network launch.
The Graph Network will have epochs, measured in blocks, which are used in the indexing reward calculations.

2.7.3 COBB-DOUGLAS PRODUCTION FUNCTION

In addition to query fees and indexing rewards, there is a Rebate Pool that rewards all network participants based on their contributions to The Graph Network. The rebate pool is designed to encourage Indexers to allocate stake in rough proportion to the amount of query fees they earn for the network.
A portion of query fees contributed to the Rebate Pool is distributed as rebate rewards using a Cobb-Douglas production function, computed from each Indexer's contribution of fees to the pool and their allocation of stake on the subgraph where the query fees were generated. This reward function has the property that when Indexers allocate stake in proportion to their share of fees contributed to the rebate pool, they receive exactly 100% of their contributed fees back as a rebate. This is also the optimal allocation.
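The proportional-allocation property can be checked numerically. Here is a minimal sketch of a Cobb-Douglas rebate; the exponent `alpha` is illustrative (the real protocol parameter is set by governance):

```python
def rebate(fees_i: float, stake_i: float,
           total_fees: float, total_stake: float,
           pool: float, alpha: float = 0.7) -> float:
    # Cobb-Douglas: the rebate is a function of an Indexer's share of fees
    # contributed to the pool and its share of allocated stake.
    return pool * (fees_i / total_fees) ** alpha * (stake_i / total_stake) ** (1 - alpha)

total_fees, total_stake = 1_000.0, 50_000.0
# An Indexer contributing 30% of fees that also allocates 30% of stake
# gets back exactly the 300 GRT of fees it contributed, for any alpha:
r = rebate(300.0, 15_000.0, total_fees, total_stake, pool=total_fees)
```

If the same Indexer allocated only 10% of stake while contributing 30% of fees, the exponents would pull its rebate below 300, which is what nudges stake toward fee share.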

2.7.4 PROTOCOL SINKS & BURNS

A portion of protocol query fees is burned, expected to start at ~1% of total query fees and subject to future technical governance. The withdrawal tax incurred by Curators and Delegators when they withdraw their GRT is also burned, as are any unclaimed rebate rewards.

2.7.5 DELEGATION PARAMETERS

Each Indexer specifies how Delegators are rewarded based on the following two delegation parameters:
  • Reward cut — The % of indexing rewards that the Indexer keeps.
  • Fee cut — The % of query fees that the Indexer keeps.
Indexers accept delegated stake up to a delegation capacity, which is a multiple of their own contributed stake. This ratio between Indexer and Delegator stake will be set through technical governance.
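The two cuts split an epoch's earnings as sketched below. All amounts, percentages and the capacity ratio are hypothetical; the real parameters are set per Indexer and by governance.

```python
def split_earnings(indexing_rewards: float, query_fees: float,
                   reward_cut: float, fee_cut: float):
    """Return (indexer_share, delegator_share).
    reward_cut and fee_cut are the fractions the Indexer keeps."""
    indexer = indexing_rewards * reward_cut + query_fees * fee_cut
    delegators = (indexing_rewards + query_fees) - indexer
    return indexer, delegators

def delegation_capacity(own_stake: float, ratio: float) -> float:
    # Delegated stake is accepted only up to a multiple of the Indexer's own stake.
    return own_stake * ratio

# Hypothetical epoch: 500 GRT of indexing rewards and 200 GRT of query fees,
# with the Indexer keeping 10% of rewards and 20% of fees.
indexer_share, delegator_share = split_earnings(500.0, 200.0,
                                                reward_cut=0.10, fee_cut=0.20)
# An Indexer staking 100k GRT under a hypothetical 16x ratio:
capacity = delegation_capacity(100_000.0, ratio=16.0)
```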

2.8 CONDITIONAL MICROPAYMENTS

Payment channels are a technology developed for scalable, off-chain, trust-minimized payments. Two parties lock funds on-chain in an escrow; those funds can then only be exchanged off-chain between the two parties, until a final transaction is submitted on-chain to withdraw from the escrow.
Traditionally, payment channel designs have emphasized securely sending a micropayment off-chain, without regard for whether the service or good being paid for was actually received.
There has been some work, however, toward atomically swapping micropayments for a digital good or outsourced computation, which The Graph's team builds on here. They call their construction WAVE Locks. WAVE stands for work, attestation, verification, expiration, and the general design is as follows:
1. Work. A consumer sends a locked micropayment with a description of the work to be performed. This specification of the work acts as the lock on the micropayment.
2. Attestation. A service provider responds with the digital good or service being requested along with a signed attestation that the work was performed correctly.
3. Verification. The attestation is verified using some method of verification. There may be penalties, such as slashing, for attesting to work which was incorrectly performed.
4. Expiration. The service provider must either receive a confirmation of receipt from the consumer or submit their attestation on-chain to receive their micropayment before the locked micropayment expires.
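The four steps above can be sketched as a small state machine. This is an illustration only: the hash stands in for a real signature scheme, and every name and value is hypothetical.

```python
import hashlib
import time
from enum import Enum, auto
from typing import Callable, Optional

class LockState(Enum):
    LOCKED = auto()     # Work: payment locked against a work description
    ATTESTED = auto()   # Attestation: provider responded with a signed attestation
    SETTLED = auto()    # Verification: attestation checked, payment released
    EXPIRED = auto()    # Expiration: deadline passed without settlement

class WaveLock:
    def __init__(self, amount: float, work_spec: str, expires_at: float):
        self.amount = amount
        self.work_spec = work_spec      # the description of the work acts as the lock
        self.expires_at = expires_at
        self.state = LockState.LOCKED
        self.attestation: Optional[str] = None

    def attest(self, response: str, provider_key: str) -> None:
        # Stand-in for a real signature over (work_spec, response).
        payload = f"{provider_key}:{self.work_spec}:{response}".encode()
        self.attestation = hashlib.sha256(payload).hexdigest()
        self.state = LockState.ATTESTED

    def settle(self, verify: Callable[[str], bool]) -> bool:
        if time.time() > self.expires_at:
            self.state = LockState.EXPIRED
            return False
        if self.state is LockState.ATTESTED and verify(self.attestation):
            self.state = LockState.SETTLED
            return True
        return False

lock = WaveLock(amount=0.0001, work_spec="query: top Uniswap pairs",
                expires_at=time.time() + 60)
lock.attest("response-bytes", provider_key="indexer-signing-key")
paid = lock.settle(verify=lambda att: att is not None)
```

In the real protocol the verification step is backed by slashing, so a provider attesting to incorrect work risks more than just losing the micropayment.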

2.9 VERIFICATION

In order for the WAVE Locks construction and indexer staking to be meaningful, there must be an effective verification mechanism that is capable of reproducing the work performed by an Indexer, identifying faults and slashing offending Indexers.
In the first phase of The Graph Network, this is handled through an on-chain dispute resolution process, which is decided through arbitration.
Fishermen submit disputes along with a bond, as well as an attestation signed by an Indexer. If the Indexer is found to have attested to an incorrect query response, the fisherman receives a portion of the slashed amount as a reward; conversely, the fisherman's bond is forfeited if the dispute is unsuccessful.
Importantly, the fisherman's reward must be less than the slashed amount; otherwise, malicious Indexers could simply slash themselves to get around thawing periods or to avoid being slashed by someone else.
In the long run, as the network becomes more reliable, The Graph's team expects the rewards earned by active fishermen to dwindle to near zero. Thus, even though there is a fisherman's reward, they consider this actor to be motivated largely by altruism.
For that reason, initially, there will be a fisherman service where consumers may post attestations, and they will take on the responsibility of verifying query responses and submitting disputes on-chain. Of course, anyone who wishes may also perform this role.
Additionally, in the early days of the network, there will be an arbitration service set via protocol governance, which will act as the sole arbitrator in the dispute resolution. This allows the team to exercise judgment when incorrect queries may arise because of bugs in the software, Indexers missing events from the blockchain, or other accidental factors that could lead to a slashable offense.
Eventually, as the software matures, Indexers will be expected to develop the operational expertise to avoid these sorts of errors.

2.10 FUTURE WORK

The crypto economy is a radical new imagining of the future of work. Open protocols will create transparency and opportunity, enabling anyone in the world to contribute their talents to a global economy. The Graph wants to support this vision and help developers build the new coordination mechanisms of the internet age.
Future work on The Graph Network involves exploring new market mechanisms and parameterization of existing mechanisms, which will make the query market more dynamic and efficient. The latter will involve running agent-based and dynamic simulations on the existing mechanism design, as well as analyzing the network after launch.
The contracts will be upgradeable so the protocol can continue to be improved after launch.
In the longer term, The Graph would like to eliminate the roles of fisherman and arbitrator altogether by relying on authenticated data structures, consensus and cryptographic proofs.

3. THE GRAPH PROTOCOL COMMUNITY

Website: https://thegraph.com/
Twitter: 14.6k followers https://twitter.com/graphprotocol
Medium: https://medium.com/graphprotocol
Telegram: 8.9k subscribers https://t.me/graphprotocol
Reddit: 278 members https://www.reddit.com/thegraph/
Github: https://github.com/graphprotocol

4. REFERENCES

  1. https://thegraph.com/blog/the-graph-grt-token-economics
  2. https://www.crunchbase.com/organization/the-graph/company_financials
  3. https://www.theblockcrypto.com/daily/72790/ethereum-devs-the-graph
  4. https://thegraph.com/blog/the-graph-network-in-depth-part-1
  5. https://thegraph.com/blog/the-graph-network-in-depth-part-2
  6. https://www.zdnet.com/article/the-graph-an-open-source-query-protocol-for-blockchains-using-graphql/
  7. https://medium.com/graphprotocol
  8. https://thegraph.com/docs/introduction#next-steps
  9. https://www.tokendaily.co/blog/on-the-value-of-decentralized-querying-
submitted by CoinEx_Institution to Coinex [link] [comments]

The Decentralized Storage War: Filecoin vs. Arweave

Over the past decade, a number of businesses have come up with ingenious ways to put empty properties and other “idle” assets to better use. Airbnb shocked the hospitality industry by making money off of otherwise empty homes and bedrooms. Uber upended the taxi industry by using otherwise idle cars as taxis. Building on this theme, decentralized storage networks aim to disrupt the cloud storage industry by increasing the use of otherwise unused computer storage.
Cloud storage is a market worthy of challenge. Over the last decade, the cloud paradigm has displaced legacy on-premises servers, birthing new tech behemoths like Amazon Web Services (AWS), Alibaba Cloud, Microsoft Azure, Google Cloud Platform (GCP) and many others. Today, market research firms estimate that the global market for cloud storage will reach $137.3 billion by 2025 while growing 22.3% per year. This market is enormous.
Recognizing this opportunity, many teams began building open source solutions in late 2016 and early 2017 including Filecoin, Storj, Sia and SAFE. In 2018, Arweave launched a mainnet based on similar ideas around decentralized storage, but with a different purpose: permanent storage. Unlike the other decentralized storage networks—which aim to compete with legacy cloud providers on cost and/or performance—Arweave uses permissionless crypto-economic incentives to create a new kind of service that wasn’t possible before.
Filecoin mining is booming in China, and FIL is now trading at around $28, implying a fully diluted network valuation of $56 billion [at time of writing, Oct. 21]. The mainnet just launched, the market is excited and hype is at an all time high. Meanwhile, Arweave recently raised capital from prominent investors including Andreessen Horowitz, Union Square Ventures and Coinbase Ventures. In April 2020, Arweave 2.0 launched, beating Filecoin to market, and it has been growing consistently since.
As investors evaluate the forthcoming Storage Wars, big questions loom. Which network will developers choose, if either? Can decentralized storage networks compete on cost against Web2 giants like Amazon and Alibaba? How should investors compare the two approaches to decentralized storage?
In this essay, we explore these questions and provide a framework to evaluate the decentralized storage market. This essay is not investment advice, but rather a framework for evaluating one of the most exciting markets in Web3. With that backdrop, the first step is understanding the differences and trade-offs between Filecoin and Arweave.

Contract-Based vs. Permanent Storage

Both Filecoin and Arweave enable decentralized, trust-minimized, censorship-resistant data storage. Both are built using blockchain technology. And both networks can be used to store data for long periods of time, either for archival purposes or for real time applications like website hosting. At first glance, they are quite similar. So rather than start by evaluating Filecoin’s and Arweave’s respective blockchains, a better place to begin is how they intend to offer storage to end users.
Filecoin's economic model mirrors that of centralized cloud providers: contract-based storage. Contract-based storage can be more simply thought of as a pay-as-you-go model. Users pay a network of nodes that store X bytes of data for Y period of time with Z retrievability guarantees. Storj, Sia and SAFE use the same model.
Arweave, on the other hand, introduces an entirely new economic model to the market, one that was never possible before the advent of permissionless crypto networks: permanent storage. With permanent storage, users pay a one-time, up-front fee to store the data forever. Permanent storage creates an entirely new market (we’ll get to this later). The Arweave protocol accomplishes this by leveraging crypto-economic game theory and creating an endowment to compensate miners for ensuring data availability, reliability and permanence.

The Filecoin Proposition

Filecoin and other contract-based decentralized storage protocols (Sia, Storj) primarily compete on cost. They claim to be able to offer lower costs than centralized providers because they utilize otherwise idle hard drive space. These networks also offer a higher degree of censorship resistance than traditional cloud storage providers.
Given Filecoin's clout, it's reasonable to expect that it will capture some share of the Web3 storage market. In practice, however, it's unlikely to sustainably undercut Amazon's pricing. Filecoin will effectively subsidize storage costs for buyers in the early days with FIL issuance, but (1) printing tokens to subsidize costs cannot last forever without adversely affecting the token price, (2) Amazon, Alibaba, Tencent, Microsoft and Google have far more capital to subsidize prices if it comes down to a price war, and (3) Amazon can cross-subsidize S3 with its other business lines such as compute and databases.
In every one of Amazon’s business units, they vertically integrate and radically slash costs. For example, Amazon has steadily increased its footprint in the shipping value stack, and today Amazon ships upwards of 50% of its own packages in the U.S. (whereas just a few years ago Amazon relied on third parties for 100% of shipments). They take a similar approach in cloud services, offering everything that developers need at massive scale and low cost. Storage customers benefit from these economies of scale (as per the chart below), but take on platform risk.
Source: Thomas Vachon
How does Amazon accomplish this? They can negotiate bulk purchase agreements that allow them to access cheaper storage and power. Per a study called Cost to Support Compute Capacity conducted by Ponemon Institute LLC, the average annual cost/kW of power ranges from $5,467 for data centers over 50,000 square feet to $26,495 for data centers between 500-5,000 square feet. Amazon also purchases hardware in large bulk, which gives them access to cheaper parts that they then pass on to cloud customers.
Today, it's challenging for an individual with idle storage space to participate on the supply side of the Filecoin network (the recommended hardware is 128-256 GB of RAM and a 24-core CPU). As such, the network at launch is dominated by miners who have invested large amounts of capital in powerful mining machines. We know from Space Race, Filecoin's incentivized testnet, that at least 145 of 230 PiB of storage was provided by the top 10 miners. It is unlikely that these medium-sized mining facilities will have lower data center costs than Amazon, which runs some of the largest data centers in the world.
Based on hardware configuration, Filecoin miners cannot outcompete traditional cloud storage providers on cost. Thus, the primary “feature” of Filecoin is not cost, but rather ideology and a greater degree of censorship resistance.

The Arweave Proposition

The Arweave Protocol provides permanent storage as a service. It does so not by creating contracts between users and storage providers, but by creating crypto-economic incentives for miners to replicate as much data as possible. Permanent data storage is an entirely new service that Amazon, Google and others cannot offer.
To store a file on Arweave, a developer creates a transaction that pays some amount of AR tokens as a network fee (currently $5/GB) to store data forever. In comparison, Amazon S3 charges $0.276/GB per year for their low tier pricing, implying that Arweave is 18x as expensive as Amazon. Arweave does not claim to compete with Amazon on cost. Arweave users are explicitly paying a premium for something that Amazon cannot offer—permanent storage.
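Using the figures cited above, the one-time fee can be compared against cumulative pay-as-you-go spend. A naive breakeven that holds S3 pricing flat:

```python
ARWEAVE_ONE_TIME = 5.0   # $/GB, paid once (figure cited above)
S3_ANNUAL = 0.276        # $/GB per year, low-tier S3 (figure cited above)

# Years of flat-rate S3 storage that would equal Arweave's one-time fee:
breakeven_years = ARWEAVE_ONE_TIME / S3_ANNUAL   # ~18.1 years
```

In practice S3 prices also fall over time, so the comparison is really about permanence as a feature, not about cost.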
We have identified two market segments that are adopting Arweave now. Players in these two sectors are less price sensitive and need permanent storage:
  1. Blockchains for data availability - Blockchains are meant to store the history of a transaction network forever. Arweave enables Layer 1 and Layer 2 teams to store a copy of their ledger permanently. This is key for auditability and redundancy. Several prominent teams such as Solana and SKALE are finalizing their integrations with Arweave for this purpose now, and we expect more chains to announce similar initiatives in the coming months.
  2. Internet Archiving - The Internet Archive, the non-profit organization behind the famous Wayback Machine (which preserves old websites even after their original creators take them down), recently announced that it will leverage Arweave to fulfill its mission. While it's not widely known, "link rot" is a massive problem. For example, over 49% of links cited in U.S. Supreme Court decisions are broken, according to research from 2013. The internet produces data at a compounding rate, and as more data is created, more links break.
While we do not know how large the market for permanent storage is, we do know that popular blockchains will need to store massive amounts of data once they achieve web scale. We also know that humanity creates a massive amount of data, and there are organizations that actively seek out ways to store humanity’s history in fault tolerant ways.
There are several other markets that we can reasonably expect to value permanence of data storage, and be willing to pay a premium for it:
  1. Journalists who want to make sure their reporting is available forever to shine light on the truth;
  2. Political dissidents who want to ensure that governments can’t censor their thoughts;
  3. Lawyers working on personal estates or trusts;
  4. NGOs or foundations who want to store their records forever;
  5. People who want to store personal memories for distant future generations.
Lastly, and most importantly, we expect Arweave will enable the creation of new kinds of unstoppable applications that rely on permanent, immutable storage. The Arweave team has been fostering the development of this nascent ecosystem for years. At the Fall 2019 Multicoin Summit, Arweave founder Sam Williams gave a presentation outlining how the Permaweb will reshape the web as we know it. Arweave today is at a stage similar to where Ethereum was in early 2016. There are dozens of developers building new kinds of Arweave-based apps, including ArDrive, Limestone, Evermore, Nest.land, Non-Zone, ArGo, Outpost, OpenBits, Verto, WeaveID and more.
Source: ViewBlock

Relative Network Valuations and Token Economics

Filecoin’s native token, FIL, has two functions:
  1. Miners must stake FIL as collateral in exchange for the ability to host files. They must stake 0.1901 FIL per 32 GiB of data that can be converted into storage mining power.
  2. FIL is used as a medium of exchange.
Payment tokens should be valued using the MV=PQ equation. For a simple model, we can use Filecoin’s own estimate of a $75 billion market size by 2021, and the USD M1 velocity of 3.9 (which is generous considering FIL is subject to the velocity problem). We use a velocity of 3.9 because there is a velocity sink in which storage nodes must stake FIL tokens to earn FIL-denominated rewards. Using a velocity of 3.9, were Filecoin to capture the entire market for cloud storage over Amazon and Alibaba, the FIL market cap would rationally be worth $19 billion ($75 billion/3.9) — 66% lower than where it trades today ($28 per FIL or a $56 billion fully diluted network valuation, at time of writing).
Given this analysis, it’s easy to see how Filecoin’s implied fully diluted network value is far ahead of its fundamentals.
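The valuation arithmetic above, made explicit. The inputs are the article's assumptions (Filecoin's market-size estimate and USD M1 velocity), not market data:

```python
# MV = PQ  =>  M (rational network value) = PQ / V
PQ = 75e9   # Filecoin's own 2021 market-size estimate, in USD
V = 3.9     # assumed token velocity (USD M1 velocity, per the article)

implied_value = PQ / V                        # ~$19.2B
current_fdv = 56e9                            # fully diluted valuation at time of writing
discount = 1 - implied_value / current_fdv    # ~66% below the traded price
```

A lower assumed velocity (i.e., a stronger staking sink) would raise the implied value; the article treats 3.9 as generous to FIL for exactly that reason.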
On the other hand, Arweave’s native token, AR, is about $2.69, implying a fully diluted valuation of $178 million. It’s important to note that AR tokens are subject to a powerful token velocity sink (more so than FIL’s circular payment structure in which storage nodes escrow FIL tokens to receive FIL tokens). When a user pays to store data on the Arweave network, they don’t just pay the miner the storage fee, which the miner can then sell for USD. Instead, more than 83% of the fee goes into an endowment pool. The endowment pool is slowly paid out to miners over time for storing data. For each file, this endowment pool slowly approaches 0 over time, but never actually reaches 0. This is possible because the cost of storage falls over time. Over the last 50 years, the cost of storage has decreased at about 41% per year. Therefore, as demand for permanent storage grows, users buy AR and then lock it in the endowment, creating continued buying pressure. We believe AR’s economic model captures value much more effectively than FIL.
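The endowment logic described above rests on a convergent geometric series: if storage costs fall by a fixed fraction each year, the total cost of storing a file forever is finite. A sketch with an illustrative first-year cost:

```python
def endowment_required(first_year_cost: float, annual_decline: float = 0.41) -> float:
    # Lifetime cost = C0 * sum over t of (1 - d)^t = C0 / d,
    # which converges because 0 < d < 1 (d = 0.41 is the article's
    # historical rate of storage-cost decline).
    return first_year_cost / annual_decline

# Hypothetical: storing 1 GB costs $0.50 in year one; funding it forever takes:
required = endowment_required(0.50)   # ~$1.22
```

The model's risk is also visible here: if storage costs ever stopped declining (d approaching 0), the required endowment would diverge.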
Or, said another way, Arweave, as a fully launched, fully functioning network with novel token economics, is valued at 0.31% of Filecoin today. The chart below demonstrates the discrepancy. Filecoin is the red line; Arweave is the blue line.
Source: CoinMarketCap

The Storage Wars

It’s impossible to say if either of these networks and their valuations will grow or shrink, if Filecoin will achieve its vision of disrupting AWS and Alibaba or if Arweave will still be around in 20 years to serve files that users pay to store today.
What is known is that Filecoin has a thriving community and a lot of early investors who participated in its $257 million ICO in 2017. It's also known that Filecoin has been under development for the past three years and that Juan Benet is an absolutely amazing engineer. The Filecoin team built both the InterPlanetary File System (IPFS) and CoinList. IPFS, which is designed to work with Filecoin, demonstrates the team's competence in creating large-scale digital infrastructure, while CoinList showcases its ability to create well-functioning user-facing applications. Both of these characteristics will be paramount for success.
On the other hand, Arweave is interesting because it serves an entirely new market, one that is uniquely enabled by blockchain technology, and one that caters to both the existing storage market and unlocks new ones. The network is live and is growing. Decentralized application interfaces from projects like SushiSwap, Uniswap V2 and yearn.finance recently adopted Arweave, paving the way for thousands of applications to follow. This will all ultimately work to accelerate Arweave’s growth.
In summary, after examining (1) the technical and economic designs of both Filecoin and Arweave, (2) the present-day dynamics in the cloud storage markets, (3) the fact that Arweave unlocks a new market and differentiates on features beyond price and (4) that AR is valued at about 0.31% of FIL, we believe there is a compelling relative value play here.
Disclosure: Multicoin has established, maintains and enforces written policies and procedures reasonably designed to identify and effectively manage conflicts of interest related to its investment activities. Multicoin Capital abides by a “No Trade Policy” for the assets listed in this report for 3 days (“No Trade Period”) following its public release. At the time of publication, Multicoin Capital holds Arweave ($AR) tokens.
Authors: Spencer Applebaum, Tushar Jain
Source: https://coinmarketcap.com/alexandria/article/the-decentralized-storage-war-filecoin-vs-arweave
submitted by mr_sonic to CryptoCurrency [link] [comments]

Dapp Solutions Great Reddit Scaling Bake-Off Submission

DAPP Solutions Great Reddit Scaling Bake-Off Proposal
Github : https://github.com/DAPP-Solutions/redditEthDappSol
LiquidApps/DAPP Documentation: https://docs.liquidapps.io/en/v2.0
EOSIO Documentation: https://github.com/EOSIO/eos
DappSolutions Telegram: https://t.me/DAPPSolutionsCommunity
About DAPP Solutions:
DAPP Solutions is a full cycle development ecosystem and DAPP Service Provider (DSP) on the Liquidapps DAPP Network, a universal middleware of powerful services for modern decentralized applications.
https://dappsolutions.app/
Our Team:
Jason Kemp, CEO
John Campbell, CTO
Ami Heines, CIO
Arunima Ray, Developer
Prasanjit Dey, Developer
Submission Tools:
Ethereum (ERC-20 Token model) https://github.com/ethereum
EOSIO (Network Resources) https://github.com/EOSIO/eos
LiquidLink (IBC) https://liquidapps.io/liquid-link
LiquidOracle (Web-interaction/IBC) https://liquidapps.io/liquid-oracles
LiquidScheduler (CRON) https://liquidapps.io/liquid-scheduler
vRAM (Memory) https://liquidapps.io/vRam
The Goal
Short Term Goal:
Reddit/Ethereum Scalability and Resource Efficiency via Blockchain Interoperability
Team DAPP Solutions would like to start by saluting Team Reddit for leading the charge in mass adoption of blockchain technologies. We will show our appreciation by developing a Reddit Community Points solution that is verifiably inexpensive, scalable and secure.
Long Term Goal:
Blockchain Agnosticism | One Network
It is a sincere pleasure for us at Team DAPP Solutions to be engaging directly with the Reddit and Ethereum communities. Those of us familiar and engaged with the LiquidApps DAPP Network have waited patiently to showcase our unique and differentiated services towards a positive sum gain for all.
Interoperability | InterBlockchain Communication (IBC)
On July 26, 2019, CryptoTwitter was treated to a demonstration by the LiquidApps Team of blockchain interoperability using a DAPP Network service called LiquidLink.
Link here:
https://twitter.com/LiquidAppsIO/status/1154842918705926145?s=20
Our slogan at DAPP Solutions since our inception is “Onboard Everyone”, so we were very excited last July to watch blockchain interoperability come to fruition between the Ethereum and EOSIO-based blockchains.
The DAPP Solutions Team is excited to introduce Reddit and the greater Ethereum community to the LiquidLink bridge between Ethereum and EOSIO. This DAPP Network service leverages the benefits and communities of both technologies in moving towards the greater goal of interoperable, scalable and decentralized blockchain solutions.
Submission Timeline Explained:
As you review our proposal, please understand that we entered this challenge on July 11, following LiquidApps' July 6 code upgrade to LiquidLink and the accompanying Medium article (https://medium.com/the-liquidapps-blog/the-dapp-networks-reddit-scaling-bounty-d60e057de6d), an upgrade aimed specifically at the challenge of the Great Reddit Scaling Bake-Off.
The article echoes our sentiments at DAPP Solutions:
“Despite getting the ball rolling on Ethereum scalability, Reddit isn’t the only one that can benefit from the DAPP Network’s unique cross-chain middleware. Any Ethereum project that wishes to scale without leaving its native ecosystem, could utilize such a mechanism to go where no dApp has gone before — mass usage.”
With limited time and resources available to us, we chose to focus on interoperability, scalability and resource efficiency with the intention of integrating with one or more existing wallets by the time submissions are reviewed and chosen.
Now with an upgraded version of LiquidLink, we set out to show our blockchain brethren what our tools can do. Our solution relies on the EOSIO infrastructure, as well as a L2 solution called the DAPP network to scale the EOSIO network’s capabilities and resource management.
Our goal in providing this POC is to demonstrate how the DAPP Network can act as a live, advanced middleware to scale the Reddit Ethereum model while ensuring resource and cost efficiency.
Demos Should Include:
  1. A live proof of concept showing hundreds of thousands of transactions. (Pending)
  2. Source Code (See GitHub link above)
  3. Documentation:
  4. How it Works and Scales:
  • Ethereum for distributing/minting tokens in ERC-20 format when users want to remove them to popular Ethereum wallets, as well as token burning when users send funds back to the account for re-entry into the EOSIO/DAPP side of the system.
  • EOSIO storage/computation to manage user balances, community distributions, purchased community features and other community functions that feed into the user balance system upon a successful round of user distribution (via karma/voting as the system is currently implemented on reddit).
  • The DAPP network for its ability to call Ethereum networks through a service called LiquidLink, which allows us to make calls to ethereum at regular intervals to update the latest IPFS hash tip. This lets us store the latest version of the memory implementation so that any DSP running the LiquidApps contract logic will be able to rebuild and continue to run the system.
  • The DAPP network for vRAM which uses IPFS to scale the storage potential of an EOSIO contract, making the upper-limit and resource usage of the contract as minimal as possible. This storage object has a hash for the latest entry which can be used when storing on ethereum.
  • The DAPP network LiquidOracle for Oracle services, in order to determine if user balances have been submitted back into the system (burned on the Ethereum contract and re-assigned to the user in EOSIO storage).
  • The DAPP network for LiquidScheduler, a cron-job library for calling ethereum at regular intervals to update the IPFS hash and to execute anything needed on the ethereum network.
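The mint/burn flow the bullets above describe can be modeled as a toy two-ledger system. The class and method names are illustrative, not the actual contracts; in the real system the burn is observed via LiquidOracle and the EOSIO side triggers minting through LiquidLink.

```python
class CommunityPoints:
    """Toy model of the two-sided balance flow: balances live on the
    EOSIO/DAPP side; exiting mints ERC-20 tokens, re-entering burns them."""

    def __init__(self):
        self.eosio_balances = {}   # user -> points tracked in the EOSIO contract
        self.erc20_balances = {}   # address -> minted ERC-20 tokens

    def distribute(self, user, karma_points):
        # Community distribution credited on the EOSIO/DAPP side.
        self.eosio_balances[user] = self.eosio_balances.get(user, 0) + karma_points

    def exit_to_ethereum(self, user, eth_addr, amount):
        assert self.eosio_balances.get(user, 0) >= amount
        self.eosio_balances[user] -= amount                                # debit EOSIO side
        self.erc20_balances[eth_addr] = (
            self.erc20_balances.get(eth_addr, 0) + amount)                 # mint ERC-20

    def reenter_from_ethereum(self, eth_addr, user, amount):
        assert self.erc20_balances.get(eth_addr, 0) >= amount
        self.erc20_balances[eth_addr] -= amount                            # burn ERC-20
        self.eosio_balances[user] = self.eosio_balances.get(user, 0) + amount

cp = CommunityPoints()
cp.distribute("alice", 100)
cp.exit_to_ethereum("alice", "0xabc", 60)
cp.reenter_from_ethereum("0xabc", "alice", 10)
```

The invariant worth noting is that total supply across both ledgers is conserved by exit and re-entry; only `distribute` (karma rounds) creates new points.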
  5. Cost Estimates (on-chain and off-chain):
Comparable metrics will be posted to the demo page, which will be linked from the github. (pending)
EOSIO requires that system tokens (EOS on the EOS mainnet) be assigned to the contract for CPU, RAM and NET. Staked CPU and NET are returned to the account over a linear timeline, similar to the DAPP token: once tokens are staked, you can increase your stake in a given resource when needed, and they can later be unstaked and sold. So while there is an initial setup cost, the ongoing costs will be greatly reduced and largely predictable.
Anyone can add resources to the contract when usage increases, these resources can all be unstaked/sold if usage decreases, or if a new contract is introduced to manage the system etc. RAM on EOSIO is bought and sold on an internal market, and does have an upper-limit.
However, the vRAM solution through the DAPP network will allow us to remove this upper-limit, and convert it into a quota of X number of calls for Y number of DAPP tokens, which can be modified accordingly without requiring more (or in some cases, a minor amount of) EOSIO RAM resources.
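The contrast between buying scarce on-chain RAM and staking for a vRAM call quota can be sketched as follows; the rates and prices are hypothetical:

```python
def vram_calls(staked_dapp: float, calls_per_dapp: float) -> float:
    # vRAM converts storage into a service quota: X calls for Y staked DAPP.
    # The stake can later be withdrawn, and the quota has no hard upper limit.
    return staked_dapp * calls_per_dapp

def eosio_ram_bytes(eos_spent: float, price_eos_per_kb: float) -> float:
    # On-chain RAM is bought on an internal market and is hard-capped.
    return eos_spent / price_eos_per_kb * 1024

quota = vram_calls(100.0, calls_per_dapp=10_000.0)   # 1,000,000 service calls
ram = eosio_ram_bytes(10.0, price_eos_per_kb=0.05)   # 204,800 bytes
```

The practical difference is that the vRAM quota scales by adjusting stake, while on-chain RAM spend is bounded by market price and the network's total supply.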
  6. How to Run It
Using our reddit simulator, and our demo page (pending), a web UI will be used to trigger the simulation while tweaking variables for efficiency.
  7. Architecture
  • An Ethereum smart contract (existing SubReddit token contract) to be called for minting, distributing, and burning tokens per community, as well as storing the latest IPFS hash from the DSP.
  • An EOSIO/DAPP smart contract which manages the community points system. This contract is interpreted first by a DSP (with consensus across multiple DSPs if needed for consensus checks to decentralize trust in a single DSP), then passed along to EOSIO for on-chain processing.
    • User balances and subscriptions/purchases/sending of tokens between users.
    • Community descriptions e.g. symbol, ETH contract address, and any other info needed (such as URLs for icons and other cosmetic community variables)
    • Management of community distributions to users upon a submitted list.
    • Calling the ethereum contract to mint/issue tokens to users that want to exit the system to an existing wallet.
    • Calling the ethereum contract to update the latest IPFS hash for recreating the data (in case of any issue at the DSP level, the data inside the contract can be re-created from this hash).
  • A node.js/postgres reddit simulation engine to generate user accounts with ethereum addresses, issue them karma, then distribute tokens according to karma rankings (recipient list approval not included). It is currently acting as our user ethereum wallets for testing purposes, and otherwise seeks only to replicate and randomize reddit usage in relation to the community point system.
  8. APIs (on-chain & off): v1.0 N/A. We have built a Reddit community points simulation instead.
    1. End-User Case
Dependent on wallet collaboration/integration for an API to connect to Reddit.
  9. Known Issues:
Pending:
  • A demo including an end-user interface for signing transactions with their Ethereum-based wallet.
    • Code attached to the submission showing how to use the same private key for both EOSIO and Ethereum accounts; easily implementable so that any existing Ethereum wallet can call both EOSIO and Ethereum-based chains with a shared private key across networks.
  • Linking Ethereum point burning back to user balances in the system.
    • Also planned: integration with the DAPP network’s LiquidOracles to check which tokens have been burned, and from what address, so that the EOSIO/DAPP model can update user balances to reflect changes in subscriptions and purchases of community features.
  • Comparative metric outputs (runnable with randomized user accounts and community interactions to ensure consistency; this randomization is already in the simulation code).
  • Functional demo page to be completed; it will be linked on GitHub.
Requirements
Scaling:
Our PoC (redditdapp.cpp) and test contract (main.cpp) currently exercise all the services necessary to meet the requirements:
  • 100,000 point claims (minting & distributing points)
  • 25,000 subscriptions
  • 75,000 one-off points burning
  • 100,000 transfers
We are in the process of setting up our live demo page to show the system in action; we will add it to the GitHub repo in the coming days, so be sure to check back!
Decentralization:
DSPs are chosen by the EOSIO/DAPP contract owner (which would be Reddit in this submission’s case); however, many DSPs can be selected by that contract and used to reach consensus among calls, as well as to decentralize the trust required when working with them across many parties. Through a combination of these services, we can decentralize every part of the system.
Usability:
While we still require a wallet to integrate with our solution, our Reddit Simulation Server is mimicking wallets for the sake of demonstration. Outside of this implementation, the system self-manages assuming there are enough resources (both for gas fees on ETH and for account/network resources on EOSIO/DAPP).
Interoperability:
Using LiquidLink and LiquidOracles, we can both sign transactions on, and listen to the Ethereum blockchain as shown in linked examples/articles above.
Security:
All systems require private keys to interact with the contracts, and otherwise run only on a scheduler; private keys need only be held for account resource management.
Resource Efficiency:
Through the usage of vRAM and EOSIO accounts, there are no TX fees outside of moving tokens onto/off of the Ethereum network.
The vRAM/LiquidLink/Oracles/Scheduler model requires that DAPP tokens be staked to a DAPP Service Provider (DSP), which must increase according to usage, but can be un-staked at any time.
Strengths of this approach
Cost/Resource Efficiency
Without the need for transaction fees at every interaction in the system on the Ethereum network, we should be able to bring the majority of system costs to a minimum. None of the fees inside the EOSIO/DAPP system are “spent” in the sense of transaction fees; they are refundable in the form of EOSIO system and DAPP tokens. These networks provide much higher throughput at a much lower cost. The only gas fees to be spent are at the update interval for the IPFS hash (sped up for simulation purposes, but this interval can be set to any amount of time), plus token minting/issuing/burning gas fees should a user decide to exit or enter the system using an ERC-20 wallet. This implementation is also easily portable to EOSIO tokens, which would speed up the system and remove transaction fees entirely.
Flow of operations:
Community Perspective
  1. Community (Reddit systems, not users) registers a subreddit with the system (EOSIO), deploys the ERC-20 token contract to an Ethereum address for that community point/token (ETH), and mints an initial supply on the ETH contract.
  2. Reddit submits token distribution per-month/round and includes user info, as well as community/token amount to the EOSIO contract (EOSIO), which then uses vRAM (DAPP) to query and store the information.
  3. LiquidScheduler runs on an interval to check which events are queued to go out to Ethereum upon being triggered (DAPP).
  4. If any users have asked to exit the system (EOSIO), the queue is run through the LiquidLink service (DAPP) to sign an Ethereum transaction sending tokens to their ETH accounts (ETH receive).
  5. To re-enter the system, users can send funds back to the community contract to be burned (ETH).
  6. During the LiquidScheduler interval check, it will call the LiquidOracle service (DAPP) to check for sends/token burns.
  7. If any burns are found, it will match the ETH address to the user account and add the funds back into their account (EOSIO/DAPP).
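The scheduler interval in steps 3–7 can be sketched roughly as follows; all names (runInterval, exitQueue, linkSend, oracleBurns) are hypothetical stand-ins for the real LiquidScheduler/LiquidLink/LiquidOracles services:

```javascript
// Illustrative sketch of one scheduler tick: flush queued exits to Ethereum,
// then poll the oracle for burns and credit the matching user accounts.
// All names are hypothetical, not the actual DAPP network service APIs.
function runInterval(state, linkSend, oracleBurns) {
  // Steps 3-4: drain the exit queue and bridge each exit out to Ethereum.
  for (const exit of state.exitQueue.splice(0)) {
    linkSend(exit.ethAddress, exit.amount); // stands in for a LiquidLink-signed tx
  }
  // Steps 6-7: match observed burns to ETH addresses and credit balances.
  for (const burn of oracleBurns()) {
    const user = state.users.find(u => u.ethAddress === burn.from);
    if (user) user.balance += burn.amount;
  }
}
```

In the real system the two callbacks would be service calls, not local functions, but the bookkeeping per tick is the same shape.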
User Perspective (Dependent on cross-chain key signing wallet support)
  1. User receives community points from distribution rounds. (Reddit -> EOSIO)
  2. User can choose to subscribe to a community, or buy community features with tokens or fiat.
    1. If fiat, ETH contract will be called using LiquidLink to burn tokens (DAPP -> ETH)
    2. If purchasing with tokens, vRAM records will be updated to deduct from the user’s balance without adding them elsewhere. (DAPP)
      1. These actions would be signed using an integration of the keycode.js file (included in the GitHub “wallet” folder).
  3. Can send tokens from one user to another, using a wallet implementation of keycode.js to verify before moving balances (EOSIO and potentially ETH depending on final wallet solution).
  4. Can select a set of community points to exit the system into their wallet (Wallet -> DAPP -> ETH)
  5. Can send tokens to the community’s ETH address to re-enter the system, no burn needed from ETH in this case, so long as the initial token supply is minted on Ethereum (ETH)
  6. The LiquidOracle service reads the Ethereum network to check for received tokens and updates the user balance in vRAM. (DAPP)

https://preview.redd.it/wzs7firyybe51.png?width=1626&format=png&auto=webp&s=cfe8200c7450cb5385b2bc1638dd750cf2cb3a58
https://preview.redd.it/tlweyqi0zbe51.png?width=1626&format=png&auto=webp&s=846081b6356204d7231ad641960ad2c66ca1c3d3
https://preview.redd.it/6pbu4po1zbe51.png?width=1626&format=png&auto=webp&s=e07b59290812a5ad630e50693910d0f5d6090ecc
https://preview.redd.it/q41riha3zbe51.png?width=1626&format=png&auto=webp&s=f2f0803a7a6a602bff314ba932223d8e482d6caf
The DAPP Solutions Team welcomes your feedback, technical questions and fair criticism.
We’ll be available via this thread or on our telegram at https://t.me/DAPPSolutionsCommunity
We humbly thank you for this opportunity.
Onward and Upward!
submitted by Gilser to ethereum [link] [comments]

[self-promotion] Insurance data and analysis curation bounty

Hi Statisticians,
We are inviting you to participate in an insurance-related curation of data and data+models. Each successful data-only submission earns the equivalent of 300 USD, for up to 4 submissions, while each successful data+model submission earns the equivalent of 100 USD, for up to 3 submissions.
Here is the listing policy for your 'data only' to be accepted: https://ipfs.kleros.io/ipfs/Qmf7odAHKuyS2NyQcV16oPPficNuESJwEU7gqeJXo2mu4E/primary-document-for-data-submissions.pdf
Here is the listing policy for your 'data+model' to be accepted: https://ipfs.kleros.io/ipfs/QmdTcyKVc8dfa2hi2sj5ChFupUe3ypmFhHwnxAt6jK6BQA/primary-document-for-model-submissions-1-.pdf
I'll be glad to assist you in submitting your entries, which requires a little knowledge of MetaMask, sending/receiving Ethereum, and the Curate dApp.
You may also challenge a submission should you find it outside the listing acceptance rules. The bounty for a successful challenge ranges from 80-150 USD plus 0.07 ETH.
Original announcement: https://blog.kleros.io/kleros-as-a-tool-for-open-innovation/
submitted by btcph to datasets [link] [comments]

CAP (Collateralized Asset Protocol) Update *Beta Launched + Roadmap*

As stated in my last post the Beta is currently being tested by the telegram community!
https://www.coingecko.com/en/coins/cap

[Text from the roadmap released today - read here at https://blog.cap.finance/2020/09/02/the-roadmap.html]
"This post lays out a 10 year roadmap for Cap. Our goal is to build open, decentralized financial services with CAP as the native token.
We consider this to be such an important mission because of its huge disruptive potential. For the first time in history, we have the opportunity to build unstoppable financial services governed by fair and open rules, as opposed to the current centralized and opaque financial system that underpins almost every aspect of our society.
Here’s what we plan to do to achieve that vision.

Perpetuals

The Cap ecosystem starts with Cap Perpetuals, which we launched in August.
Perpetuals let you open leveraged long or short positions on certain markets using stablecoins such as DAI. Profits are backed by the Cap Liquidity Pool (CLP), which in turn receives trader losses. Yield generated by the CLP is used to buy back CAP.
The current iteration of Perpetuals is semi-decentralized. Things like price feeds and liquidations are managed by our server to ensure speed and predictability of execution. We believe that a semi-decentralized form of the Perpetuals product will be desirable for the foreseeable future, particularly for traders who are speed-sensitive.
A fully decentralized Perpetuals design is possible and planned. Price feeds are provided by independent oracles while liquidations are monitored and managed by anyone willing to do so in exchange for a fee. This design is not trivial and issues like oracle front-running, which can result in riskless profit, as well as informed trading flow, which can drain the CLP, will need to be dealt with accordingly.
Perpetuals generate sustainable revenue that allows us to go after higher risk markets as described below.

Synthetics

Synthetics are crypto-backed instruments that track the value of an underlying asset, like a stock. They let you gain exposure to an asset without needing to own it, and without interacting with the traditional financial system, saving time and money.
Cap Synthetics are non-leveraged, withdrawable ERC-20 tokens that you can purchase using stablecoins. They are backed by the CLP and their price is determined by oracle feeds.
To purchase a synthetic, you send an amount of stablecoins to our Synthetics contract. The contract determines the price of the asset you’re trying to buy based on its oracle feed and mints the equivalent amount of tokens to your address.
To sell a synthetic, you send your asset tokens to our Synthetics contract, which determines the amount of stablecoins to send to you based on the oracle feed. The tokens you send are then burned.
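The mint/burn accounting described above reduces to two conversions at the oracle price. A simplified sketch (the oracle feed and token bookkeeping here are stand-ins for the real contracts):

```javascript
// Toy model of the Synthetics mint/burn flow. In the real contract the oracle
// price comes from a feed and tokens are ERC-20 mints/burns; here we only
// model the arithmetic.
function buySynthetic(stablecoinIn, oraclePrice) {
  // contract mints tokens equal to the stablecoin value divided by asset price
  return stablecoinIn / oraclePrice;
}

function sellSynthetic(tokensIn, oraclePrice) {
  // tokens sent in are burned; stablecoins are returned at the oracle price
  return tokensIn * oraclePrice;
}

const minted = buySynthetic(1500, 150);  // 1500 DAI at $150/share → 10 tokens
const redeemed = sellSynthetic(10, 160); // price moved to $160 → 1600 DAI back
console.log(minted, redeemed); // → 10 1600
```

Note that price movement between mint and burn is exactly the exposure the CLP backs: the 100 DAI gain here would be paid out of the pool.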
We plan to release a proof of concept for our Synthetics product some time in September. The goal is end-to-end decentralization. The backend logic is executed entirely in smart contracts and client side code is hosted on IPFS, accessible through an unstoppable .crypto domain.

Exchange

A token exchange will be part of the Cap ecosystem. It will be decentralized and function similarly to Uniswap, with a few differences, including a more efficient market maker based on our technical whitepaper.
In short, Cap Exchange will have a market maker that will adapt its liquidity curve based on the type of assets being exchanged. For example, a volatile asset pair would have a more sensitive spread than a non volatile one, resulting in a superior experience for traders while protecting liquidity providers.
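One way to picture an adaptive curve like this is a constant-product swap whose fee widens with the pair's volatility. The parameters below are invented for illustration and are not Cap's actual design:

```javascript
// Toy constant-product swap with a volatility-scaled fee, sketching the
// "adaptive liquidity curve" idea. Fee coefficients are made up for
// illustration; the real design is in Cap's technical whitepaper.
function swapOut(reserveIn, reserveOut, amountIn, volatility) {
  const fee = 0.001 + 0.01 * volatility;   // wider spread for volatile pairs
  const effectiveIn = amountIn * (1 - fee);
  // constant-product invariant: reserveIn * reserveOut stays (nearly) fixed
  return (reserveOut * effectiveIn) / (reserveIn + effectiveIn);
}

// Same trade on a stable pair vs a volatile pair:
const stable = swapOut(1_000_000, 1_000_000, 1000, 0.0);
const volatile = swapOut(1_000_000, 1_000_000, 1000, 0.5);
console.log(stable > volatile); // → true (volatile pair pays a wider spread)
```

The wider spread on volatile pairs is what protects liquidity providers, at the cost of slightly worse execution for traders on those pairs.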

Lending

Lending and borrowing are the pillars of the current financial system. Several protocols today operate overcollateralized lending on Ethereum, but we believe a much larger market exists in undercollateralized lending coupled with credit (reputation) and identity.
Cap Lending will be built in conjunction with a decentralized reputation system that will allow lenders to evaluate borrowers from a risk perspective on a case by case basis. Decentralized, unsecured lending is not trivial but we believe we’ve come up with a design to make it work.

More

Several other services will also be built under the Cap umbrella. Those include staking, asset management, prediction markets, identity, insurance, and gaming. Admittedly, many of the implementation details for these products are still TBD, but we trust we will figure them out in due time.
We also plan to build traditional infrastructure services that interact directly with smart contracts on the blockchain, including oracles, to support Cap’s smart contracts without needing to rely on and pay for an external third party in the long run. Cap Oracles will be an independently run entity.
CAP is the native token underpinning the entire ecosystem. CAP holders will not only be entitled to the global yield generated by the system, but they will also be able to vote on system parameters and asset offerings, among other things. The point of Cap is to create a financial system governed by its users and the CAP token makes that possible.
Cap’s values are:
We plan to build out Cap with those values in mind. Focusing on the core features that matter to most users. Continuously engaging with the community in a transparent manner. Getting high-impact smart contracts audited by competent third parties prior to wide usage. And more.
Cap is community driven. Join us on Telegram."
submitted by FriendlyTemperature to CryptoMoonShots [link] [comments]

Tutorial: Ethereum RPCs, Methods and Calls

JSON RPC, methods, calls, requests - what does it all mean?! When you start building a dapp on the Ethereum blockchain, you’re introduced to a host of new concepts, request methods and naming conventions to employ - it can be overwhelming. The Infura team are experts in web3 infrastructure. We build open source tools and materials to help more developers interact with Ethereum and IPFS. In this tutorial, we leverage the collective experience of our team to bring you an in-depth guide to reading and writing requests to the Ethereum blockchain, using Infura.
submitted by infura to eth [link] [comments]

What is the minimum number of nodes to run a blog on IPFS-Cluster?

Hi guys
I'm looking into setting up an IPFS cluster on AWS EC2 to build a decentralized version of my blog, but I don't know the minimum number of nodes (EC2 instances) needed to run it. I read something about the Raft algorithm, and in that post the author said 'the minimum amount of nodes is above 3' to keep the data safe permanently in case any node goes down.
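For context on that "3 nodes" figure: Raft-style consensus commits only with a majority quorum, so a cluster of n peers tolerates whatever is left after that majority fails. A quick sketch of the arithmetic:

```javascript
// Raft quorum arithmetic: a cluster of n peers needs a majority
// (floor(n/2) + 1) to commit, so it tolerates n - majority failures.
// This is general Raft math, not IPFS-Cluster-specific code.
function raftTolerance(n) {
  const majority = Math.floor(n / 2) + 1;
  return { majority, tolerated: n - majority };
}

console.log(raftTolerance(3)); // → { majority: 2, tolerated: 1 }
console.log(raftTolerance(5)); // → { majority: 3, tolerated: 2 }
```

So 3 nodes is the smallest cluster that survives one node failing, and even-sized clusters add no extra tolerance over the odd size below them.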
Another thought: a few minutes ago I discovered that you can get decentralized domain names, like the ones delivered by the Ethereum Name Service (ENS). What other blockchain projects are doing the same? I mean, it's really an awesome moment to learn about this stuff!!
submitted by Neumma to ipfs [link] [comments]

RESEARCH REPORT ABOUT KYBER NETWORK

Author: Gamals Ahmed, CoinEx Business Ambassador

https://preview.redd.it/9k31yy1bdcg51.jpg?width=936&format=pjpg&auto=webp&s=99bcb7c3f50b272b7d97247b369848b5d8cc6053

ABSTRACT

In this research report, we present a study on Kyber Network. Kyber Network is a decentralized, on-chain liquidity protocol designed to make trading tokens simple, efficient, robust and secure.
Kyber design allows any party to contribute to an aggregated pool of liquidity within each blockchain while providing a single endpoint for takers to execute trades using the best rates available. We envision a connected liquidity network that facilitates seamless, decentralized cross-chain token swaps across Kyber based networks on different chains.
Kyber is a fully on-chain liquidity protocol that enables decentralized exchange of cryptocurrencies in any application. Liquidity providers (Reserves) are integrated into one single endpoint for takers and users. When a user requests a trade, the protocol will scan the entire network to find the reserve with the best price and take liquidity from that particular reserve.

1.INTRODUCTION

DeFi applications all need access to good liquidity sources, which is a critical component to provide good services. Currently, decentralized liquidity is comprised of various sources including DEXes (Uniswap, OasisDEX, Bancor), decentralized funds and other financial apps. The more scattered the sources, the harder it becomes for anyone to either find the best rate for their trade or to even find enough liquidity for their need.
Kyber is a blockchain-based liquidity protocol that aggregates liquidity from a wide range of reserves, powering instant and secure token exchange in any decentralized application.
The protocol allows for a wide range of implementation possibilities for liquidity providers, allowing a wide range of entities to contribute liquidity, including end users, decentralized exchanges and other decentralized protocols. On the taker side, end users, cryptocurrency wallets, and smart contracts are able to perform instant and trustless token trades at the best rates available amongst the sources.
The Kyber Network is a project based on the Ethereum protocol that seeks to completely decentralize the exchange of cryptocurrencies and make exchange trustless by keeping everything on the blockchain.
Through the Kyber Network, users should be able to instantly convert or exchange any crypto currency.

1.1 OVERVIEW ABOUT KYBER NETWORK PROTOCOL

The Kyber Network is a decentralized way to exchange ETH and different ERC20 tokens instantly — no waiting and no registration needed.
Using this protocol, developers can build innovative payment flows and applications, including instant token swap services, ERC20 payments, and financial DApps — helping to build a world where any token is usable anywhere.
Kyber’s fully on-chain design allows for full transparency and verifiability in the matching engine, as well as seamless composability with DApps, not all of which are possible with off-chain or hybrid approaches. The integration of a large variety of liquidity providers also makes Kyber uniquely capable of supporting sophisticated schemes and catering to the needs of DeFi DApps and financial institutions. Hence, many developers leverage Kyber’s liquidity pool to build innovative financial applications, and not surprisingly, Kyber is the most used DeFi protocol in the world.
The Kyber Network is quite an established project that is trying to change the way we think of decentralised crypto currency exchange.
The Kyber Network has seen very rapid development. After being announced in May 2017 the testnet for the Kyber Network went live in August 2017. An ICO followed in September 2017, with the company raising 200,000 ETH valued at $60 million in just one day.
The live main net was released in February 2018 to whitelisted participants, and on March 19, 2018, the Kyber Network opened the main net as a public beta. Since then the network has seen increasing growth, with network volumes growing more than 500% in the first half of 2019.
There was a modest decrease in August 2019, attributable to the price of ETH dropping by 50%, which impacted the overall volumes traded and processed globally.
They are developing a decentralised exchange protocol that will allow developers to build payment flows and financial apps. This is indeed quite a competitive market as a number of other such protocols have been launched.
In Brief
- Kyber Network is a tool that allows anyone to swap tokens instantly without having to use exchanges.
- It allows vendors to accept different types of cryptocurrency while still being paid in their preferred crypto of choice.
- It’s built primarily for Ethereum, but any smart-contract based blockchain can incorporate it.
At its core, Kyber is a decentralized way to exchange ETH and different ERC20 tokens instantly–no waiting and no registration needed. To do this Kyber uses a diverse set of liquidity pools, or pools of different crypto assets called “reserves” that any project can tap into or integrate with.
A typical use case would be if a vendor allowed customers to pay in whatever currency they wish, but receive the payment in their preferred token. Another example would be for Dapp users. At present, if you are not a token holder of a certain Dapp you can’t use it. With Kyber, you could use your existing tokens, instantly swap them for the Dapp specific token and away you go.
All this swapping happens directly on the Ethereum blockchain, meaning every transaction is completely transparent.

1.1.1 WHY BUILD THE KYBER NETWORK?

While crypto currencies were built to be decentralized, many of the exchanges for trading crypto currencies have become centralized affairs. This has led to security vulnerabilities, with many exchanges becoming the victims of hacking and theft.
It has also led to increased fees and costs, and the centralized exchanges often come with slow transfer times as well. In some cases, wallets have been locked and users are unable to withdraw their coins.
Decentralized exchanges have popped up recently to address the flaws in centralized exchanges, but they have their own flaws, most notably a lack of liquidity and oftentimes high costs to modify trades in their on-chain order books.

Some of the Integrations with Kyber Protocol
The Kyber Network was formed to provide users with a decentralized exchange that keeps everything right on the blockchain, and uses a reserve system rather than an order book to provide high liquidity at all times. This will allow for the exchange and transfer of any cryptocurrency, even cross exchanges, and costs will be kept at a minimum as well.
The Kyber Network has had three guiding design philosophies since the start:
  1. To be most useful the network needs to be platform-agnostic, which allows any protocol or application the ability to take advantage of the liquidity provided by the Kyber Network without any impact on innovation.
  2. The network was designed to make real-world commerce and decentralized financial products not only possible but also feasible. It does this by allowing for instant token exchange across a wide range of tokens, and without any settlement risk.
  3. The Kyber Network was created with ease of integration as a priority, which is why everything runs fully on-chain and fully transparent. Kyber is not only developer-friendly, but is also compatible with a wide variety of systems.

1.1.2 WHO INVENTED KYBER?

Kyber’s founders are Loi Luu, Victor Tran, Yaron Velner — CEO, CTO, and advisor to the Kyber Network.

1.1.3 WHAT DISTINGUISHES KYBER?

Kyber’s mission has always been to integrate with other protocols so they’ve focused on being developer-friendly by providing architecture to allow anyone to incorporate the technology onto any smart-contract powered blockchain. As a result, a variety of different dapps, vendors, and wallets use Kyber’s infrastructure including Set Protocol, bZx, InstaDApp, and Coinbase wallet.
Besides, dapps, vendors, and wallets, Kyber also integrates with other exchanges such as Uniswap — sharing liquidity pools between the two protocols.
Limit orders on Kyber allow users to set a specific price in which they would like to exchange a token instead of accepting whatever price currently exists at the time of trading. However, unlike with other exchanges, users never lose custody of their crypto assets during limit orders on Kyber.
The Kyber protocol works by using pools of crypto funds called “reserves”, which currently support over 70 different ERC20 tokens. Reserves are essentially smart contracts with a pool of funds. Different parties with different prices and levels of funding control the reserves. Instead of using order books to match buyers and sellers to return the best price, the Kyber protocol looks at all the reserves and returns the best price among them. Reserves make money on the “spread”, or difference between the buying and selling prices. Kyber wants any token holder to be able to easily convert one token to another with a minimum of fuss.

1.2 KYBER PROTOCOL

The protocol smart contracts offer a single interface for the best available token exchange rates to be taken from an aggregated liquidity pool across diverse sources.
● Aggregated liquidity pool. The protocol aggregates various liquidity sources into one liquidity pool, making it easy for takers to find the best rates offered with one function call.
● Diverse sources of liquidity. The protocol allows different types of liquidity sources to be plugged in. Liquidity providers may employ different strategies and different implementations to contribute liquidity to the protocol.
● Permissionless. The protocol is designed to be permissionless: any developer can set up various types of reserves, and any end user can contribute liquidity. Implementations need to take into consideration various security vectors, such as reserve spamming, which can be mitigated through a staking mechanism. We can expect implementations to be permissioned initially until the maintainers are confident about these considerations.
The core feature that the Kyber protocol facilitates is the token swap between taker and liquidity sources. The protocol aims to provide the following properties for token trades:
● Instant Settlement. Takers do not have to wait for their orders to be fulfilled, since trade matching and settlement occur in a single blockchain transaction. This enables trades to be part of a series of actions happening in a single smart contract function.
● Atomicity. When takers make a trade request, their trade either gets fully executed or is reverted. This “all or nothing” aspect means that takers are not exposed to the risk of partial trade execution.
● Public rate verification. Anyone can verify the rates that are being offered by reserves and have their trades instantly settled just by querying the smart contracts.
● Ease of integration. Trustless and atomic token trades can be directly and easily integrated into other smart contracts, thereby enabling multiple trades to be performed in a smart contract function.
How each actor works is specified in the Network Actors section.
  1. Takers refer to anyone who can directly call the smart contract functions to trade tokens, such as end-users, DApps, and wallets.
  2. Reserves refer to anyone who wishes to provide liquidity. They have to implement the smart contract functions defined in the reserve interface in order to be registered and have their token pairs listed.
  3. Registered reserves refer to those that will be cycled through for matching taker requests.
  4. Maintainers refer to anyone who has permission to access the functions for adding/removing reserves and token pairs, such as a DAO or the team behind the protocol implementation.
  5. Together, these actors comprise the network: everyone involved in any given implementation of the protocol.
The protocol implementation needs to have the following:
  1. Functions for takers to check rates and execute trades
  2. Functions for the maintainers to register/remove reserves and token pairs
  3. A reserve interface that defines the functions reserves need to implement
https://preview.redd.it/d2tcxc7wdcg51.png?width=700&format=png&auto=webp&s=b2afde388a77054e6731772b9115ee53f09b6a4a

1.3 KYBER CORE SMART CONTRACTS

Kyber Core smart contracts is an implementation of the protocol that has major protocol functions to allow actors to join and interact with the network. For example, the Kyber Core smart contracts provide functions for the listing and delisting of reserves and trading pairs by having clear interfaces for the reserves to comply to be able to register to the network and adding support for new trading pairs. In addition, the Kyber Core smart contracts also provide a function for takers to query the best rate among all the registered reserves, and perform the trades with the corresponding rate and reserve. A trading pair consists of a quote token and any other token that the reserve wishes to support. The quote token is the token that is either traded from or to for all trades. For example, the Ethereum implementation of the Kyber protocol uses Ether as the quote token.
In order to search for the best rate, all reserves supporting the requested token pair will be iterated through. Hence, the Kyber Core smart contracts need to have this search algorithm implemented.
The key functions implemented in the Kyber Core Smart Contracts are listed in Figure 2 below. We will visit and explain the implementation details and security considerations of each function in the Specification Section.

1.4 HOW KYBER’S ON-CHAIN PROTOCOL WORKS?

Kyber is the liquidity infrastructure for decentralized finance. Kyber aggregates liquidity from diverse sources into a pool, which provides the best rates for takers such as DApps, Wallets, DEXs, and End users.

1.4.1 PROVIDING LIQUIDITY AS A RESERVE

Anyone can operate a Kyber Reserve to market make for profit and make their tokens available for DApps in the ecosystem. Through an open reserve architecture, individuals, token teams and professional market makers can contribute token assets to Kyber’s liquidity pool and earn from the spread in every trade. These tokens become available at the best rates across DApps that tap into the network, making them instantly more liquid and useful.
MAIN RESERVE TYPES
Kyber currently has over 45 reserves in its network providing liquidity. There are 3 main types of reserves that allow different liquidity contribution options to suit the unique needs of different providers.
  1. Automated Price Reserves (APR) — Allow token teams and users with large token holdings to have an automated yet customized pricing system with low maintenance costs. Synthetix and Melon are examples of teams that run APRs.
  2. Fed Price Reserves (FPR) — Operated by professional market makers that require custom and advanced pricing strategies tailored to their specific needs. Kyber, alongside reserves such as OneBit, runs FPRs.
  3. Bridge Reserves (BR) — Specialized reserves meant to bring liquidity from other on-chain liquidity providers like Uniswap, Oasis, DutchX, and Bancor into the network.

1.5 KYBER NETWORK ROLES

The Kyber Network functions through coordination between several different roles:
  • Users — This entity uses the Kyber Network to send and receive tokens. A user can be an individual, a merchant, or even a smart contract account.
  • Reserve Entities — This role adds liquidity to the platform through the dynamic reserve pool. Some reserve entities are internal to the Kyber Network, but others may be registered third parties. Reserve entities are public if the public contributes to the reserves they hold; otherwise they are considered private. Allowing third-party reserve entities adds diversity, which prevents monopolization and keeps exchange rates competitive. It also allows for the listing of less popular coins with lower volumes.
  • Reserve Contributors — Where reserve entities are public, the reserve contributor is the entity providing reserve funds. Their incentive for doing so is a profit share from the reserve.
  • The Reserve Manager — Maintains the reserve, calculates exchange rates, and enters them into the network. The reserve manager profits from exchange spreads set by them on their reserves. They can also benefit from increasing volume by accessing the entire Kyber Network.
  • The Kyber Network Operator — Currently the Kyber Network team fills the role of network operator, adding/removing Reserve Entities and controlling the listing of tokens. Eventually, this role will revert to a proper decentralized governance.

1.6 BASIC TOKEN TRADE

A basic token trade is one that has the quote token as either the source or destination token of the trade request. The execution flow of a basic token trade is depicted in the diagram below, using as an example a taker who would like to exchange ETH for BAT tokens. The trade happens in a single blockchain transaction.
1. Taker sends 1 ETH to the protocol contract, and would like to receive BAT in return.
2. Protocol contract queries the first reserve for its ETH to BAT exchange rate.
3. Reserve 1 offers an exchange rate of 1 ETH for 800 BAT.
4. Protocol contract queries the second reserve for its ETH to BAT exchange rate.
5. Reserve 2 offers an exchange rate of 1 ETH for 820 BAT.
6. This process is repeated for the other reserves. After the iteration, reserve 2 is found to have offered the best ETH to BAT exchange rate.
7. Protocol contract sends 1 ETH to reserve 2.
8. Reserve 2 sends 820 BAT to the taker.
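The reserve-selection logic in the steps above can be sketched in a few lines of Python. This is only a toy model with the illustrative rates from the example, not the actual protocol contract:

```python
# Toy model of the basic trade flow: the protocol contract polls every
# reserve for its rate and routes the whole trade to the best one.

def best_reserve(rates):
    """Return (reserve_name, rate) offering the most BAT per ETH."""
    return max(rates.items(), key=lambda kv: kv[1])

def swap_eth_for_bat(eth_amount, rates):
    name, rate = best_reserve(rates)
    return name, eth_amount * rate  # BAT amount sent to the taker

# Rates mirror the example above: reserve 2 wins with 820 BAT per ETH.
rates = {"reserve1": 800, "reserve2": 820, "reserve3": 790}
winner, bat_out = swap_eth_for_bat(1, rates)
print(winner, bat_out)  # reserve2 820
```

In the real protocol this iteration happens on-chain inside a single transaction, so the taker never holds an intermediate balance.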

1.7 TOKEN-TO-TOKEN TRADE

A token-to-token trade is one where the quote token is neither the source nor the destination token of the trade request. The execution flow of a token-to-token trade is depicted in the diagram below, using as an example a taker who would like to exchange BAT tokens for DAI. The trade happens in a single blockchain transaction.
1. Taker sends 50 BAT to the protocol contract, and would like to receive DAI in return.
2. Protocol contract sends 50 BAT to the reserve offering the best BAT to ETH rate.
3. Protocol contract receives 1 ETH in return.
4. Protocol contract sends 1 ETH to the reserve offering the best ETH to DAI rate.
5. Protocol contract receives 30 DAI in return.
6. Protocol contract sends 30 DAI to the taker.
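The two-hop routing above is just function composition over exchange rates. A minimal sketch, using the illustrative numbers from the example (the rates and hop labels are not real market data):

```python
# Sketch of the token-to-token flow: BAT -> ETH on the best BAT/ETH reserve,
# then ETH -> DAI on the best ETH/DAI reserve, all inside one "transaction".

def route(amount, hops):
    """Apply each (label, rate) hop in turn and return the final amount."""
    for _, rate in hops:
        amount *= rate
    return amount

# 50 BAT -> 1 ETH (0.02 ETH per BAT), then 1 ETH -> 30 DAI (30 DAI per ETH)
dai_out = route(50, [("BAT->ETH", 0.02), ("ETH->DAI", 30)])
print(dai_out)  # 30.0
```

Because ETH is the quote token, every token-to-token trade decomposes into two basic trades like this.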

2. KYBER NETWORK CRYSTAL (KNC) TOKEN

Kyber Network Crystal (KNC) is an ERC-20 utility token and an integral part of Kyber Network.
KNC is the first deflationary staking token where staking rewards and token burns are generated from actual network usage and growth in DeFi.
The Kyber Network Crystal (KNC) is the backbone of the Kyber Network. It works to connect liquidity providers and those who need liquidity, and serves three distinct purposes. The first of these is to collect transaction fees, and a portion of every fee collected is burned, which keeps KNC deflationary. Kyber Network Crystals (KNC) are named after the crystals in Star Wars used to power lightsabers.
KNC also ensures the smooth operation of the reserve system in the Kyber liquidity network, since entities must use third-party tokens to buy the KNC that pays for their operations in the network.
KNC allows token holders to play a critical role in determining the incentive system, building a wide base of stakeholders, and facilitating economic flow in the network. A small fee is charged each time a token exchange happens on the network, and KNC holders get to vote on this fee model and distribution, as well as other important decisions. Over time, as more trades are executed, additional fees will be generated for staking rewards and reserve rebates, while more KNC will be burned.
- Participation rewards — KNC holders can stake KNC in the KyberDAO and vote on key parameters. Voters will earn staking rewards (in ETH).
- Burning — Some of the network fees will be burned to reduce KNC supply permanently, providing long-term value accrual from decreasing supply.
- Reserve incentives — KNC holders determine the portion of network fees that are used as rebates for selected liquidity providers (reserves) based on their volume performance.
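The burn mechanic is the simplest of the three to model: a share of each period's collected fees is destroyed, so total supply only ever shrinks. A toy sketch, where the fee volume and burn share are made-up numbers (the actual split is decided by KNC holder votes):

```python
# Toy deflation model: a fixed share of each period's network fees is
# burned, permanently shrinking total supply. All inputs are hypothetical.

def supply_after(initial_supply, fees_per_period, burn_share, periods):
    supply = initial_supply
    for _ in range(periods):
        supply -= fees_per_period * burn_share
    return supply

# e.g. 100k KNC of fees per epoch, 5% burned, ~one year of 2-week epochs
s = supply_after(210_940_000, 100_000, 0.05, 26)
print(s)  # 210810000.0
```

The point of the model is only the direction: as long as trade volume (and thus fees) is positive, supply is strictly decreasing.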

Finally, the KNC token is the connection between the Kyber Network and the exchanges, wallets, and dApps that leverage the liquidity network. This is a virtuous system since entities are rewarded with referral fees for directing more users to the Kyber Network, which helps increase adoption for Kyber and for the entities using the Network.
And of course there will soon be fourth and fifth uses for KNC: as a staking token used to generate passive income, and as a governance token used to vote on key parameters of the network.
The Kyber Network Crystal (KNC) was released in a September 2017 ICO at a price around $1. There were 226,000,000 KNC minted for the ICO, with 61% sold to the public. The remaining 39% are controlled 50/50 by the company and the founders/advisors, with a 1 year lockup period and 2 year vesting period.
Currently, just over 180 million coins are in circulation, and the total supply has been reduced to 210.94 million after the company burned its first million KNC tokens in May 2019 and its second million just three months later.
That means that while it took 15 months to burn the first million KNC, it took just 10 weeks to burn the second million. That shows how rapidly adoption of Kyber has been growing recently, with July 2019 USD trading volumes on the Kyber Network nearly reaching $60 million. This volume has continued growing, and on March 13, 2020 the network experienced its highest daily trading activity of $33.7 million in a 24-hour period.
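As a quick sanity check on that burn-rate comparison, the speedup follows directly from the two durations (using the rough approximation of 52/12 weeks per month):

```python
# Burn-rate speedup: first million KNC took ~15 months, second ~10 weeks.

WEEKS_PER_MONTH = 52 / 12  # rough approximation, ~4.33

def speedup(first_months, second_weeks):
    return (first_months * WEEKS_PER_MONTH) / second_weeks

print(round(speedup(15, 10), 1))  # 6.5
```

So the second million was burned roughly 6.5 times faster than the first.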
Currently KNC is required by Reserve Managers to operate on the network, which ensures a minimum amount of demand for the token. Combined with future plans for burning coins, price is expected to maintain an upward bias, although it has suffered along with the broader market in 2018 and more recently during the summer of 2019.
It was unfortunate in 2020 that a beginning rally was cut short by the coronavirus pandemic, although the token has stabilized as of April 2020, and there are hopes the rally could resume in the summer of 2020.

2.1 HOW ARE KNC TOKENS PRODUCED?

The native token of Kyber is called Kyber Network Crystal (KNC). All reserves are required to pay fees in KNC for the right to manage reserves. The KNC collected as fees is either burned and taken out of the total supply, or awarded to integrated dApps as an incentive to help them grow.

2.2 HOW DO YOU GET HOLD OF KNC TOKENS?

KyberSwap can be used to buy ETH directly using a credit card, which can then be swapped for KNC. Besides Kyber itself, exchanges such as Binance, Huobi, and OKEx trade KNC.

2.3 WHAT CAN YOU DO WITH KYBER?

The most direct and basic function of Kyber is instantly swapping tokens without registering an account, which anyone can do using an Ethereum wallet such as MetaMask. Users can also create their own reserves and contribute funds to a reserve, but that process is still a fairly technical one, something Kyber is working on making easier for users in the future.

2.4 THE GOAL OF KYBER IN THE FUTURE

The goal of Kyber in the coming years is to solidify its position as a one-stop solution for powering liquidity and token swapping on Ethereum. Kyber plans a major protocol upgrade called Katalyst, which will create new incentives and growth opportunities for all stakeholders in its ecosystem, especially KNC holders. The upgrade will mean more use cases for KNC, including voting on governance decisions through a decentralized autonomous organization (DAO) called the KyberDAO.
With the upcoming Katalyst protocol upgrade and new KNC model, Kyber will provide even more benefits for stakeholders. For instance, reserves will no longer need to hold a KNC balance for fees, removing a major friction point, and there will be rebates for top-performing reserves. KNC holders can also stake their KNC to participate in governance and receive rewards.

2.5 BUYING & STORING KNC

Those interested in buying KNC tokens can do so at a number of exchanges. Among the full list, your best bets are likely Coinbase Pro and Binance. The former is based in the USA, whereas the latter is an offshore exchange.
The trading volume is well spread out across these exchanges, which means that the liquidity is not concentrated on, or dependent on, any one exchange. You also have decent liquidity on each of the exchange books. For example, the Binance KNC/BTC books are deep and there is decent turnover, which makes order execution easier.
KNC is an ERC20 token and can be stored in any wallet with ERC20 support, such as MyEtherWallet or MetaMask. One interesting alternative is the KyberSwap Android mobile app that was released in August 2019.
It allows for instant swapping of tokens and has support for over 70 different altcoins. It also allows users to set price alerts and limit orders and works as a full-featured Ethereum wallet.

2.6 KYBER KATALYST UPGRADE

Kyber has announced their intention to become the de facto liquidity layer for the Decentralized Finance space, aiming to have Kyber as the single on-chain endpoint used by the majority of liquidity providers and dApp developers. In order to achieve this goal the Kyber Network team is looking to create an open ecosystem that garners trust from the decentralized finance space. They believe this is the path that will lead the majority of projects, developers, and users to choose Kyber for liquidity needs. With that in mind they have recently announced the launch of a protocol upgrade to Kyber which is being called Katalyst.
The Katalyst upgrade will create a stronger ecosystem by creating strong alignments towards a common goal, while also strengthening the incentives for stakeholders to participate in the ecosystem.
The primary beneficiaries of the Katalyst upgrade will be the three major Kyber stakeholders: 1. Reserve managers who provide network liquidity; 2. dApps that connect takers to Kyber; 3. KNC holders.
These stakeholders can expect to see the benefits highlighted below. Reserve Managers will see two new benefits from providing liquidity for the network. The first of these is incentives for providing reserves: once Katalyst is implemented, part of the fees collected will go to reserve managers as an incentive for providing liquidity.
This mechanism is similar to rebates in traditional finance, and is expected to drive the creation of additional reserves and market making, which in turn will lead to greater liquidity and platform reach.
Katalyst will also do away with the need for reserve managers to maintain a KNC balance for use as network fees. Instead, fees will be automatically collected and used as incentives or burned as appropriate. This should remove a great deal of friction for reserves to connect with Kyber without affecting the competitive exchange rates that takers in the system enjoy.
dApp Integrators will now be able to set their own spread, which will give them full control over their own business model. This means the current fee sharing program, which shares 30% of the 0.25% fee with dApp developers, will go away and developers will determine their own spread. It's believed this will increase dApp development within Kyber as developers will now be in control of fees.
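The difference between the two integrator revenue models is easy to put in numbers. A sketch comparing the current 30%-of-0.25% fee share with a hypothetical custom spread under Katalyst (the trade size and 0.1% spread are made-up examples):

```python
# Current model: dApp receives 30% of the flat 0.25% network fee.
# Katalyst model: the integrator sets and keeps its own spread.

def dapp_revenue_old(trade_value):
    network_fee = trade_value * 0.0025      # flat 0.25% fee
    return network_fee * 0.30               # 30% shared with the dApp

def dapp_revenue_katalyst(trade_value, custom_spread):
    return trade_value * custom_spread      # integrator keeps its own spread

print(dapp_revenue_old(10_000))             # 7.5
print(dapp_revenue_katalyst(10_000, 0.001)) # 10.0
```

Even a modest 0.1% custom spread out-earns the old fee share on the same volume, which is the business-model control the upgrade is aiming for.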
KNC Holders, often thought of as the core of the Kyber Network, will be able to take advantage of a new staking mechanism that will allow them to receive a portion of network fees by staking their KNC and participating in the KyberDAO.

2.7 COMING KYBERDAO

With the implementation of the Katalyst protocol the KNC holders will be put right at the heart of Kyber. Holders of KNC tokens will now have a critical role to play in determining the future economic flow of the network, including its incentive systems.
The primary way this will be achieved is through KyberDAO, a way in which on-chain and off-chain governance will align to streamline cooperation between the Kyber team, KNC holders, and market participants.
The Kyber Network team has identified 3 key areas of consideration for the KyberDAO:
1. Broad representation, transparent governance and network stability
2. Strong incentives for KNC holders to maintain their stake and be highly involved in governance
3. Maximizing participation with a wide range of options for voting delegation
Interaction between KNC Holders & Kyber
This means KNC holders have been empowered to determine the network fee and how to allocate the fees to ensure maximum network growth. KNC holders will now have three fee allocation options to vote on:
- Voting Rewards: Immediate value creation. Holders who stake and participate in the KyberDAO get their share of the fees designated for rewards.
- Burning: Long-term value accrual. The decreasing supply of KNC will improve the token appreciation over time and benefit those who did not participate.
- Reserve Incentives: Value creation via network growth. By rewarding Kyber reserve managers based on their performance, it helps to drive greater volume, value, and network fees.
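The voting itself is stake-weighted: each staker's influence is proportional to their KNC. A minimal sketch of tallying such a vote between the three options (the stake figures are invented for illustration, and the real KyberDAO mechanics are more involved):

```python
# Stake-weighted vote sketch: each staker backs one allocation option,
# and the option with the most KNC behind it wins.

from collections import defaultdict

def tally(votes):
    """votes: list of (option, staked_knc). Returns (winner, total_stake)."""
    totals = defaultdict(int)
    for option, stake in votes:
        totals[option] += stake
    return max(totals.items(), key=lambda kv: kv[1])

votes = [
    ("voting_rewards", 4_000_000),
    ("burning", 2_500_000),
    ("reserve_incentives", 1_000_000),
    ("voting_rewards", 1_500_000),
]
print(tally(votes))  # ('voting_rewards', 5500000)
```

In practice the vote sets the proportions across all three buckets rather than picking a single winner, but the stake-weighting principle is the same.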

2.8 TRANSPARENCY AND STABILITY

The design of the KyberDAO is meant to allow for the greatest network stability, as well as maximum transparency and the ability to quickly recover in emergency situations. Initially the Kyber team will remain as maintainers of the KyberDAO. The system is being developed to be as verifiable as possible, while still maintaining maximum transparency regarding the role of the maintainer in the DAO.
Part of this transparency means that all data and processes are stored on-chain if feasible. Voting regarding network fees and allocations will be done on-chain and will be immutable. In situations where on-chain storage or execution is not feasible there will be a set of off-chain governance processes developed to ensure all decisions are followed through on.

2.9 KNC STAKING AND DELEGATION

Staking will be a new addition, and both staking and voting will be done in fixed periods of time called “epochs”. These epochs will be measured in Ethereum block times, and each KyberDAO epoch will last roughly 2 weeks.
This is a relatively rapid epoch, which is beneficial in that it allows faster DAO conclusions and decision-making, as well as faster reward distribution. On the downside, it means there needs to be a new voting campaign every two weeks, which requires more frequent participation from KNC stakeholders, as well as more work from the Kyber team.
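Since epochs are measured in blocks rather than wall-clock time, a two-week epoch translates to a block count. A back-of-the-envelope conversion, assuming an average Ethereum block time of about 13 seconds (block times vary, so this is only an estimate):

```python
# Rough conversion of a ~2-week KyberDAO epoch into Ethereum blocks.

SECONDS_PER_BLOCK = 13  # approximate mainnet average; an assumption

def epoch_in_blocks(days, seconds_per_block=SECONDS_PER_BLOCK):
    return days * 24 * 60 * 60 // seconds_per_block

print(epoch_in_blocks(14))  # 93046
```

So "roughly 2 weeks" corresponds to on the order of 90,000 blocks per epoch.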
Delegation will be part of the protocol, allowing stakers to delegate their voting rights to third-party pools or other entities. The pools receiving the delegation rights will be free to determine their own fee structure and voting decisions. Because the pools will share in rewards, and because their voting decisions will be clearly visible on-chain, it is expected that they will continue to work to the benefit of the network.
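Pro-rata reward sharing within a delegation pool can be sketched directly. The 5% pool fee below is a made-up number, since each pool is free to set its own fee structure:

```python
# Sketch of pool delegation: stakers delegate voting power to a pool,
# the pool takes its fee from the epoch reward, and the remainder is
# distributed pro rata to delegators by stake.

def distribute_rewards(delegations, epoch_reward_eth, pool_fee=0.05):
    total = sum(delegations.values())
    net = epoch_reward_eth * (1 - pool_fee)
    return {who: net * stake / total for who, stake in delegations.items()}

payouts = distribute_rewards({"alice": 6000, "bob": 4000}, 10.0)
print(payouts)  # {'alice': 5.7, 'bob': 3.8}
```

Because payouts and votes are both on-chain, delegators can verify that a pool's fee and voting behavior match what it advertises.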

3. TRADING

After the September 2017 ICO, KNC settled into a trading price that hovered around $1.00 (decreasing in BTC value) until December. The token has followed the trend of most other altcoins — rising in price through December and sharply declining toward the beginning of January 2018.
The KNC price fell throughout all of 2018 with one exception during April. From April 6th to April 28th, the price rose over 200 percent. This run-up coincided with a blog post outlining plans to bring Bitcoin to the Ethereum blockchain. Since then, however, the price has steadily fallen, currently resting on what looks like a $0.15 (~0.000045 BTC) floor.
With the number of partners using the Kyber Network, the price may rise as they begin to fully use the network. The development team has consistently hit the milestones they’ve set out to achieve, so make note of any release announcements on the horizon.

4. COMPETITION

The 0x project is the biggest competitor to Kyber Network. Both teams are attempting to enter the decentralized exchange market. The primary difference between the two is that Kyber performs the entire exchange process on-chain while 0x keeps the order book and matching off-chain.
As a crypto swap exchange, the platform also competes with ShapeShift and Changelly.

5. KYBER MILESTONES

• June 2020: Digifox, an all-in-one finance application by popular crypto trader and Youtuber Nicholas Merten a.k.a DataDash (340K subs), integrated Kyber to enable users to easily swap between cryptocurrencies without having to leave the application.
• June 2020: Stake Capital partnered with Kyber to provide convenient KNC staking and delegation services, and also took a KNC position to participate in governance.
• June 2020: Outlined the benefits of the Fed Price Reserve (FPR) for professional market makers and advanced developers.
• May 2020: Kyber crossed US$1 Billion in total trading volume and 1 Million transactions, performed entirely on-chain on Ethereum.
• May 2020: StakeWith.Us partnered with Kyber Network as a KyberDAO Pool Master.
• May 2020: 2Key, a popular blockchain referral solution using smart links, integrated Kyber's on-chain liquidity protocol for seamless token swaps.
• May 2020: Blockchain game League of Kingdoms integrated Kyber to accept token payments for Land NFTs.
• May 2020: Joined the Zcash Developer Alliance, an invite-only working group to advance Zcash development and interoperability.
• May 2020: Joined the Chicago DeFi Alliance to help accelerate on-chain market making for professionals and developers.
• March 2020: Set a new record of USD $33.7M in 24H fully on-chain trading volume, and $190M in 30-day on-chain trading volume.
• March 2020: Integrated by Rarible, Bullionix, and Unstoppable Domains, with the KyberWidget deployed on IPFS, which allows anyone to swap tokens through Kyber without being blocked.
• February 2020: Popular Ethereum blockchain game Axie Infinity integrated Kyber to accept ERC20 payments for NFT game items.
• February 2020: Kyber's protocol was integrated by Gelato Finance, Idle Finance, rTrees, Sablier, and 0x API for their liquidity needs.
• January 2020: Kyber Network was found to be the most used protocol in the whole decentralized finance (DeFi) space in 2019, according to a DeFi research report by Binance.
• December 2019: Switcheo integrated Kyber's protocol for enhanced liquidity on their own DEX.
• December 2019: DeFi Wallet Eidoo integrated Kyber for seamless in-wallet token swaps.
• December 2019: Announced the development of the Katalyst Protocol Upgrade and new KNC token model.
• July 2019: Developed the Waterloo Bridge, a Decentralized Practical Cross-chain Bridge between EOS and Ethereum, successfully demonstrating a token swap between Ethereum and EOS.
• July 2019: Trust Wallet, the official Binance wallet, integrated Kyber as part of its decentralized token exchange service, allowing even more seamless in-wallet token swaps for thousands of users around the world.
• May 2019: HTC, the large consumer electronics company with more than 20 years of innovation, integrated Kyber into its Zion Vault Wallet on EXODUS 1, the first native web 3.0 blockchain phone, allowing users to easily swap between cryptocurrencies in a decentralized manner without leaving the wallet.
• January 2019: Introduced the Automated Price Reserve (APR), a capital-efficient way for token teams and individuals to market make with low slippage.
• January 2019: The popular Enjin Wallet, a default blockchain DApp on the Samsung S10 and S20 mobile phones, integrated Kyber to enable in-wallet token swaps.
• October 2018: Kyber was a founding member of the WBTC (Wrapped Bitcoin) Initiative and DAO.
• October 2018: Developed the KyberWidget for ERC20 token swaps on any website, with CoinGecko being the first major project to use it on their popular site.

Full Article

submitted by CoinEx_Institution to kybernetwork [link] [comments]

Tutorial: Ethereum RPCs, Methods and Calls

JSON RPC, methods, calls, requests - what does it all mean?! When you start building a dapp on the Ethereum blockchain, you’re introduced to a host of new concepts, request methods and naming conventions to employ - it can be overwhelming. The Infura team are experts in web3 infrastructure. We build open source tools and materials to help more developers interact with Ethereum and IPFS. In this tutorial, we leverage the collective experience of our team to bring you an in-depth guide to reading and writing requests to the Ethereum blockchain, using Infura.
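At its core, every one of those requests is a small JSON-RPC 2.0 payload. A minimal sketch using the standard `eth_blockNumber` method: it only builds the request body and parses a sample response, since sending it would require a real endpoint URL (such as an Infura project URL, which is omitted here):

```python
# Build a JSON-RPC 2.0 request for eth_blockNumber and parse a response.

import json

def make_request(method, params=None, req_id=1):
    return json.dumps({
        "jsonrpc": "2.0",
        "method": method,
        "params": params or [],
        "id": req_id,
    })

def parse_block_number(response_body):
    # Ethereum quantities come back as 0x-prefixed hex strings.
    return int(json.loads(response_body)["result"], 16)

payload = make_request("eth_blockNumber")
print(parse_block_number('{"jsonrpc":"2.0","id":1,"result":"0x4b7"}'))  # 1207
```

POSTing `payload` to any Ethereum JSON-RPC endpoint returns the latest block number encoded as hex, which is the pattern the rest of the tutorial builds on.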
submitted by infura to ethdev [link] [comments]

$1500 Decentralized Blog Contest by Unstoppable Domains!

Hey everyone,
We recently released a blog template that allows any user to create and deploy a blog to IPFS in a few minutes.
To celebrate, we are launching a competition to reward the top 15 decentralized blogs from a pool of $1,500 in Ethereum!
Rules:
  1. Create a decentralized blog using our template.
  2. Reply with your .crypto domain on this Twitter post before June 24th at 9 AM EST.
  3. 15 winning blogs will be announced on June 25th. Each winning blog will receive $100 in ETH.
Feel free to write about any topic you are interested in. We are looking for original, creative, insightful content.
How can I create a decentralized blog?
We have a video and a text guide that explains how you can create a decentralized blog with a few clicks.
Looking for inspiration?
Here are a handful of blogs that have been released so far:
(Use Opera for Android, the Unstoppable Extension for Chrome, Edge, and Brave, or the Unstoppable Blockchain Browser to view the blogs)
Don't have a .crypto domain? You can get one for free by participating in this giveaway with Opera.
submitted by MagoCrypto to CryptoCurrency [link] [comments]

Decentralized Blog Contest

Hello everyone,
We recently released a blog template that allows any user to create and deploy a blog to IPFS in a few minutes.
To celebrate, we are launching a competition to reward the top 15 decentralized blogs from a pool of $1,500 in Ethereum!
Rules:
  1. Create a decentralized blog using our template.
  2. Comment your .crypto blog on this Twitter post before June 24th 9 AM EST.
  3. 15 winning blogs will be announced on June 25th. Each winning blog will receive $100 in ETH.
Feel free to write about any topic you are interested in.
How can I create a decentralized blog?
We have a video and a text guide that explains how you can create a decentralized blog with a few clicks.
Don't have a .crypto domain? You can grab one for free by participating in this giveaway with Opera.
submitted by MagoCrypto to ethtrader [link] [comments]

7 legendary and most successful ICOs in cryptocurrency history

Let's find out which companies have succeeded with the initial token offering (ICO), raised as much money as planned, and fulfilled their promises to investors.
Every year, tens of billions of dollars are invested in tokens. The importance of this issue increased after Pavel Durov's TON ICO was actually outlawed by an American court. The uncertain status of cryptocurrency Libra from Mark Zuckerberg also added concern to crypto investors. So was there at least one successful ICO?
Yes, it was, and not just one. Let's arrange these cases in chronological order.

1. Mastercoin — 2013

You could hear about this cryptocurrency under the name “Omni”. This was the first registered ICO (we could call it the grandfather of ICO).
On July 31, 2013, a special fund was created for investment. About 500 people transferred 5,000 BTC into this fund. In 2013, this amount was $500 thousand. For the first time in the history of cryptocurrencies, the creators promised anyone who bought a Mastercoin the opportunity to use it as an investment tool. After the launch of the system, the value of the coins was supposed to increase, and the holder could sell them freely.
Was this plan implemented? Yes, it was. In less than a year, Mastercoin already ranked seventh in the cryptocurrency market.
The renaming of Mastercoin to Omni took place in 2015. Now it's not just a coin, but a Bitcoin-based platform on which one can trade digital assets, and also create them.

2. Ethereum (ETH) — 2014

One of the prime examples of a successful token placement campaign. In just 12 hours, it raised $2.3 million. And in September 2014 it raised $18.4 million in total.
This is how it all happened. A unique feature of the platform at that time was the smart contract system. A key feature of Ethereum is to provide a basis for other projects to build and develop their technologies.
Information about the total number of available tokens was not disclosed, but 60 million tokens were successfully sold. The fundraising was not capped at any global limit.
Was the ICO successful and have all the promises been fulfilled? Definitely. To this day, this ICO is considered one of the most successful in history and an example of worthy crowdfunding. Ethereum lives, develops and is second in terms of capitalization after BTC. On its basis, new platforms are being built.

3. EOS Project (EOS) from block.one — 2017

This project raised $185 million for the development and implementation of a new blockchain architecture that automates financial processes and evaluates transaction parameters. It helps to create high-quality business applications.

4. Status (SNT) — 2017

Another example of brilliant success. This blockchain messenger and mobile operating system (built on Ethereum technology) was developed to work with decentralized mobile applications. Status raised over $100 million on the first day. The promises have been fulfilled: the applications work and support encrypted messages, smart contracts, payments, chatbots, and any available ICOs. There's also a built-in currency exchange. The system allows you to store your crypto assets in a special Status wallet.

5. Bancor (BNT) — 2017

In 2016-2017 there was a real ICO boom. The Bancor project showed even faster fundraising than its predecessors. In just 3 hours, $140 million worth of tokens were bought. In total, BNT was sold in the amount of $153 million. Bancor's goal is to increase the liquidity of ERC-20 (Ethereum) tokens and make BNT a de facto reserve currency. It doesn't require any exchanges and offers its owners an investment basket. Bancor works with smart contracts and allows you to issue your own tokens and link any tokens to a plastic card.
However, you can only call it successful with some limitations. It's restricted in the USA, and there are questions about the tokens that circulate on this platform. Still, outside of America, people make BNT transactions, which means it can't be called a failure (Gram is also banned in the USA for now).

6. Tezos (XTZ) — 2017

For the first 5 days, the Swiss company Tezos raised $137 million through ICO. The total amount of token sale was about $230 million. This placement is rightfully considered one of the most successful in crypto history.
The project offers a flexible alternative system of smart contracts and is opposed to the Ethereum system on which many companies build their networks.

7. Filecoin (FIL) — 2017

In 2014, Protocol Labs launched this system as part of a secure and reliable data storage program based on IPFS protocol (InterPlanetary File System). The regulated ICO of 2017 showed excellent results with the requested $40 million. It was possible to raise $257 million, i.e. almost 6.5 times more.
After the boom in 2016-2017, there were many successful ICOs, but these seven placements were most memorable.

Where you can see all ICOs yourself

There're several useful resources that allow you to get information about the active and upcoming placement of tokens (without investment recommendations) and an archive of past ICOs. These are ICOMARKS and ICODROPS platforms.
submitted by CoinjoyAssistant to CryptoCurrencies [link] [comments]

7 legendary and most successful ICOs in cryptocurrency history

Let's find out which companies have succeeded with the initial token offering (ICO), raised as much money as planned, and fulfilled their promises to investors.
Every year, tens of billions of dollars are invested in tokens. The importance of this issue increased after Pavel Durov's TON ICO was actually outlawed by an American court. The uncertain status of cryptocurrency Libra from Mark Zuckerberg also added concern to crypto investors. So was there at least one successful ICO?
Yes, It was, and not just one. Let's arrange these cases in chronological order.

1. Mastercoin — 2013

You could hear about this cryptocurrency under the name “Omni”. This was the first registered ICO (we could call it the grandfather of ICO).
On July 31, 2013, a special fund was created for investment. About 500 people transferred 5,000 BTC into this fund. In 2013, this amount was $ 500 thousand. For the first time in the history of cryptocurrencies, the creators promised anyone who buys a Mastercoin an opportunity to use it as an investment tool. After the launch of the system, the value of coins was supposed to increase, and the holder could sell it freely.
Was this plan implemented? Yes, it was. In less than a year, Mastercoin already ranked seventh in the cryptocurrency market.
The renaming of Mastercoin to Omni took place in 2015. Now it’s not just a coin, but a Bitcoin-based platform, on which the one can trade digital assets, and also create them.

2. Ethereum (ETH) — 2014

One of the prime examples of a successful token placement campaign. In just 12 hours, it raised $2.3 million. And in September 2014 it raised $18.4 million in total.
This is how it all happened. A unique feature of the platform at that time was the smart contract system. A key feature of Ethereum is to provide a basis for other projects to build and develop their technologies.
Information about the total number of available tokens was not disclosed, but 60 million tokens were successfully sold. Global fundraising goals were not limited to anything.
Was the ICO successful and have all the promises been fulfilled? Definitely. To this day, this ICO is considered one of the most successful in history and an example of worthy crowdfunding. Ethereum lives, develops and is second in terms of capitalization after BTC. On its basis, new platforms are being built.

3. EOS Project (EOS) from block.one — 2017

This project raised $185 million for the development and implementation of a new blockchain architecture that automates financial processes and evaluates transaction parameters. It helps to create high-quality business applications.

4. Status (SNT) — 2017

Another example of brilliant success. This blockchain messenger and mobile operating system (built on Etehreum technologies) were developed to work with decentralized mobile applications. Status raised over $100 million on the first day. Promises are fulfilled, applications work and allow to use encrypted messages, smart contracts, payments, chatbots, and operate with any available ICOs. There's also a built-in currency exchange. The system allows you to store your crypto assets in a special Status wallet.

5. Bancor (BNT) — 2017

In 2016-2017 there was a real ICO boom. The Bancor project's shown even faster fundraising than its predecessors. In just 3 hours, $140 million tokens were bought. In total, BNT was sold in the amount of $153 million. Bancor's goal is to increase the liquidity of ERC-20 tokens (Etehereum) and make BNT actually reserve currency. It doesn't require any exchanges and offers its owners an investment basket. Bancor works with smart contracts and allows you to issue your tokens and link any tokens to a plastic card.
However, you can only call it successful with some limitations. It's restricted in the USA, and there are questions about the tokens that rotate on this platform. However, outside of America, people make BNT transactions, which means that it can’t be called a failure (Gram's also banned in the USA yet).

6. Tezos (XTZ) — 2017

For the first 5 days, the Swiss company Tezos raised $137 million through ICO. The total amount of token sale was about $230 million. This placement is rightfully considered one of the most successful in crypto history.
The project offers a flexible alternative system of smart contracts and is opposed to the Ethereum system on which many companies build their networks.

7. Filecoin (FIL) — 2017

In 2014, Protocol Labs launched this system as part of a secure and reliable data storage program based on IPFS protocol (InterPlanetary File System). The regulated ICO of 2017 showed excellent results with the requested $40 million. It was possible to raise $257 million, i.e. almost 6.5 times more.
After the boom of 2016-2017 there were many successful ICOs, but these seven placements were the most memorable.

Where you can see all ICOs yourself

There are several useful resources that provide information about active and upcoming token placements (without investment recommendations), plus an archive of past ICOs. These are the ICOMARKS and ICODROPS platforms.
submitted by CoinjoyAssistant to ICOAnalysis [link] [comments]

Ethereum on ARM. Ethereum 1.0/2.0 ecosystem installation on Ubuntu server 64bit for Raspberry Pi 4 (step by step guide) - Join ETH 2.0 testnets through Prysm / Lighthouse clients on RPi4. Memory enhancements

Ethereum on ARM is a project that provides custom Linux images for the Raspberry Pi 4 (Ethereum on ARM32 repo [1]), NanoPC-T4 [2] and RockPro64 [3] boards (Ethereum on ARM64 repo [4]) that run the Geth or Parity Ethereum clients as a boot service and automatically turn these ARM devices into full Ethereum nodes. The images include other components of the Ethereum ecosystem, such as Status.im, Raiden, IPFS, Swarm and Vipnode, as well as initial support for ETH 2.0 clients.
Images take care of all necessary steps, from setting up the environment and formatting the SSD disk to installing and running the Ethereum software as well as synchronizing the blockchain.
All you need to do is flash the MicroSD card, plug in an Ethernet cable, connect the SSD disk and power on the device.

The what and why of a 64-bit Raspberry Pi 4 image: 32 vs 64 bits

Following my last post [5], there are a couple of ongoing issues with the Raspberry Pi 4 that prevent Ethereum software from running as expected.
Given the massive adoption of the Raspberry Pi, a 64-bit image would let users run full nodes without RAM issues and join the ETH 2.0 testnets. While Raspbian certainly plans to migrate the OS to a full 64-bit image, there is no official statement or roadmap for this. In the meantime, with ETH 2.0 phase 0 around the corner, it is worth trying to find a viable alternative for running a 64-bit OS on the Pi.
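A back-of-the-envelope illustration of why a 32-bit OS hits RAM limits: a 32-bit process can address at most 2^32 bytes, and part of that is reserved for the kernel, so a syncing Geth node runs out of usable address space well before it runs out of physical RAM. The 3 GiB user-space figure below is a typical ARM Linux split, assumed here for illustration.

```shell
# Total 32-bit virtual address space, in GiB:
echo $(( (1 << 32) / (1024 * 1024 * 1024) ))   # prints 4
# Typical usable user space on 32-bit ARM Linux (3G/1G split), in GiB:
echo 3
```

A 64-bit kernel and user space removes this ceiling entirely, which is the whole point of the image described next.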

Installation of an Ubuntu server 64-bit image for the Raspberry Pi 4 with the Ethereum 1.0/2.0 ecosystem: step-by-step guide

James Chambers released an 18.04 Ubuntu server 64-bit image for the Raspberry Pi 4, built from the latest Raspbian and the official 19.04 Ubuntu images [7] (amazing job, by the way). Obviously, this is a work in progress, has no official support and is not considered stable. But it runs reasonably well and, as stated before, a native 64-bit image opens the door for Raspberry Pi 4 users to join the ETH 2.0 public test networks, as well as solving the 32-bit "out of memory" RAM issues. So it is worth giving it a try.
DISCLAIMER: As this is a handcrafted image, the installation is not plug-and-play like the stable Ethereum on ARM images, so you will need some Linux skills here. I tried to make the process as straightforward as possible, though (it should take just 10-15 minutes).
The installation procedure has two main steps.
  1. Install Raspbian, needed to update the Raspberry Pi 4 to the latest (unstable) firmware and to format the USB disk
  2. Install Ubuntu 64-bit, to get a system with a 64-bit kernel and user space
INSTALL RASPBIAN
DISCLAIMER: In this step we’re going to update the firmware to the latest unstable version. This can break things and even render the Raspberry Pi unbootable (not likely at all, but possible), so be careful here.
Prerequisites: Make sure you have the USB disk attached (blue USB port) and the network connected to the RPi 4.
1. In your Linux desktop: Open a terminal, download the Raspbian image, insert the MicroSD and flash it:
wget https://downloads.raspberrypi.org/raspbian_lite/images/raspbian_lite-2019-09-30/2019-09-26-raspbian-buster-lite.zip
unzip 2019-09-26-raspbian-buster-lite.zip
sudo dd bs=1M if=2019-09-26-raspbian-buster-lite.img of=/dev/mmcblk0 conv=fdatasync status=progress
2. In your Raspberry Pi 4: Insert the MicroSD, boot up and log in (user: pi, password: raspberry). Keep in mind that SSH is not enabled by default, so you will need a monitor/TV and a keyboard. Download the Ethereum on ARM setup script to update the firmware and format the USB 3 disk.
wget https://github.com/diglos/pi-gen/raw/ethraspbian2.0/stage2/04-ethereum/files/ethonarm-rpi4-ubuntu64bit-setup.sh
sudo sh ethonarm-rpi4-ubuntu64bit-setup.sh
3. Power off the Raspberry Pi and remove the MicroSD again.
INSTALL UBUNTU SERVER
1. In your Linux desktop: Insert the MicroSD. Open a terminal, download James Chambers' Ubuntu image and flash it:
wget https://github.com/TheRemote/Ubuntu-Server-raspi4-unofficial/releases/download/v26/ubuntu-18.04.3-preinstalled-server-arm64+raspi4.img.xz
xz -d ubuntu-18.04.3-preinstalled-server-arm64+raspi4.img.xz
sudo dd bs=1M if=ubuntu-18.04.3-preinstalled-server-arm64+raspi4.img of=/dev/mmcblk0 conv=fdatasync status=progress
2. In your Raspberry Pi 4: Insert the MicroSD, boot up and log in (user:ubuntu password:ubuntu). You will be prompted to change the password so you’ll need to log in twice. SSH is enabled by default.
3. RUN ETHEREUM ON ARM INSTALLATION SCRIPT
Download the install script and run it:
wget https://github.com/diglos/pi-gen/raw/ethraspbian2.0/stage2/04-ethereum/files/ethonarm-rpi4-ubuntu64bit-install.sh
sudo sh ethonarm-rpi4-ubuntu64bit-install.sh
4. Reboot the Pi and you will be running a full ETH 1.0 node / ETH 2.0 test client on a native Ubuntu server 64-bit OS. Keep in mind that Geth and Status run by default, so if you don’t want to run a full Ethereum node or a Status node, you need to stop and disable those services.
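A minimal sketch of how those default services could be stopped and disabled. The unit names geth and status are assumptions based on the project's naming conventions, so verify them on your device with `systemctl list-units --type=service` before running anything.

```shell
# Assumed unit names -- verify first with: systemctl list-units --type=service
sudo systemctl stop geth && sudo systemctl disable geth        # ETH 1.0 node
sudo systemctl stop status && sudo systemctl disable status    # Status node
```

`stop` halts the running service; `disable` keeps it from starting again at the next boot.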

ETH 2.0 testnets on Raspberry Pi 4

The latest Prysm and Lighthouse versions are now available, so you can join and test the ETH 2.0 implementations with your Raspberry Pi 4:
Prysm 0.3.1: Take into account that, at the time of writing, the sync process is very slow; see [8]. The package provides 2 binaries: beacon-chain and validator.
Follow the instructions [9] to join the test network. Basically: get Goerli ETH (no real value), set the validator password and run both clients. Once you have completed all the required steps and made sure everything works as expected, you can edit the file /etc/ethereum/prysm-validator.conf and set the password defined earlier in order to run both clients as systemd services:
sudo systemctl start prysm-beacon
sudo systemctl start prysm-validator
For more information about joining the Prysm testnet check the official docs [10]
Lighthouse 0.1.1: Currently provided as a single binary (no systemd service): lighthouse.
The Lighthouse team is making fantastic progress as well [12], so expect news regarding the testnet soon. You can check their official documentation here [11].
ETH 2.0 is under heavy development right now [13], so I will provide client updates as soon as they are released. You can update the ETH 2.0 clients anytime by running:
sudo apt-get update && sudo apt-get install prysm-beacon prysm-validator lighthouse 
If you have the time and the skills to install the image and join the ETH 2.0 testnets, please do. It is very valuable feedback for all the teams involved, particularly at this early stage.

Memory and cache tweaks

While working on enabling swap memory for the Ubuntu 64-bit image, I realized that the Armbian developers have put a lot of work into maximizing memory usage on these kinds of boards [14]. This is really important with such a limited resource, which needs to be used as efficiently as possible (avoiding "out of RAM" crashes without wasting too much CPU and disk throughput on swap tasks).
Particularly, I focused on 2 tasks:
I encourage users to test these parameters and try to find the combination that achieves the best sync results. I'm currently running several tests on the RPi4 to find an optimal setup.
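As an illustration of the Armbian-style tweaks referenced in [14], the fragment below sketches a zram-backed swap device with a high swappiness value. The compression algorithm, size and sysctl values here are assumptions to experiment with, not the image's shipped defaults.

```shell
# Sketch: back swap with compressed RAM (zram) instead of the SSD,
# then tell the kernel it is cheap to swap. Values are illustrative.
sudo modprobe zram                                  # creates /dev/zram0
echo lz4 | sudo tee /sys/block/zram0/comp_algorithm # fast compression
echo 1G  | sudo tee /sys/block/zram0/disksize       # uncompressed capacity
sudo mkswap /dev/zram0
sudo swapon -p 100 /dev/zram0                       # prefer zram over disk swap
sudo sysctl vm.swappiness=100                       # swapping to RAM is cheap
```

Because zram pages stay in (compressed) RAM, swapping to it avoids the disk-throughput cost that makes conventional swap so painful during a sync.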
On top of this, I also ran some tests on the NanoPC (in this case, lowering the default Geth cache a little, to 768) that resulted in sync-time improvements:
NanoPC-T4 Fast Sync data

References

  1. https://github.com/diglos/pi-gen
  2. https://www.friendlyarm.com/index.php?route=product/product&product_id=225
  3. https://store.pine64.org/?product=rockpro64-4gb-single-board-computer
  4. https://github.com/diglos/userpatches
  5. https://www.reddit.com/ethereum/comments/eehdjq/ethereum_on_arm_raspberry_pi_4_out_of_memory/
  6. https://github.com/ethereum/go-ethereum/issues/20190
  7. https://jamesachambers.com/raspberry-pi-4-ubuntu-server-desktop-18-04-3-image-unofficial/
  8. https://github.com/prysmaticlabs/prysm/issues/4508
  9. https://prylabs.net/participate
  10. https://prysmaticlabs.gitbook.io/prysm/how-prysm-works/overview-technical
  11. https://lighthouse-book.sigmaprime.io/intro.html
  12. https://twitter.com/paulhauner/status/1217349576278999041
  13. https://blog.ethereum.org/2020/01/16/eth2-quick-update-no-7/
  14. https://forum.armbian.com/topic/5565-zram-vs-swap/
submitted by diglos76 to ethereum [link] [comments]
