
Best $100-$300 FPGA development board in 2018?

Hello, I’ve been trying to decide on an FPGA development board, and have only been able to find posts and Reddit threads from 4-5 years ago. So I wanted to start a new thread and ask about the best “mid-range” FPGA development board in 2018. (Price range: $100-$300.)
I started with this Quora answer about FPGA boards, from 2013. The Altera DE1 sounded good. Then I looked through the Terasic DE boards.
Then I found this Reddit thread from 2014, asking about the DE1-SoC vs the Cyclone V GX Starter Kit: https://www.reddit.com/FPGA/comments/1xsk6w/cyclone_v_gx_starter_kit_vs_de1soc_board/ (I was also leaning towards the DE1-SoC.)
Anyway, I thought I better ask here, because there are probably some new things to be aware of in 2018.
I’m completely new to FPGAs and VHDL, but I have experience with electronics/microcontrollers/programming. My goal is to start with some basic soft-core processors. I want to get some C / Rust programs compiling and running on my own CPU designs. I also want to play around with different instruction sets, and maybe start experimenting with asynchronous circuits (e.g. clock-less CPUs).
Also, I don’t know if this is possible, but I’d like to experiment with ternary computing, or work with analog signals instead of purely digital logic. EDIT: I just realized that those are called FPAAs, i.e. “analog” instead of “gate”. It would be cool if there were a dev board that also had an FPAA, but no problem if not.
EDIT 2: I also realized why "analog signals on an FPGA" doesn't make any sense, because of how LUTs work. They emulate boolean logic with a lookup table, and the table can only store 0s and 1s. So there's no way to emulate a transistor in an intermediate state. I'll just have to play around with some transistors on a breadboard.
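To make that concrete, here's a rough generic Verilog model of what a 4-input LUT boils down to (just an illustrative sketch, not any vendor's actual primitive): the configuration is a 16-entry table of 0s and 1s, and the inputs simply select a row.

```verilog
// Generic model of a 4-input LUT: a 16x1-bit truth table.
// The INIT parameter is the LUT's "program"; the inputs just select a row.
module lut4 #(
    // Example truth table: output 1 only when all four inputs are 1
    // (a 4-input AND gate). Any 4-input boolean function fits, but
    // every entry is strictly 0 or 1 -- no intermediate states.
    parameter [15:0] INIT = 16'b1000_0000_0000_0000
) (
    input  wire [3:0] sel,  // the four LUT inputs
    output wire       out
);
    assign out = INIT[sel];
endmodule
```

A real FPGA has thousands of these plus programmable routing between them, but the stored table is purely binary, which is why analog states don't fit the model.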
UPDATE: I've put together a table with some of the best options:
Board | Maker | Chip | LUTs | Price | SoC? | Features
--- | --- | --- | --- | --- | --- | ---
icoBoard | Lattice | iCE40-HX8K | 7,680 | $100 | Sort of | A very simple FPGA development board that plugs into a Raspberry Pi, so you have a "backup" hard-core CPU that can handle networking, etc. Supports a huge range of pmod accessories. You can write a program/circuit so that the Raspberry Pi CPU and the FPGA work together, similar to a SoC. The proprietary bitstream is fully reverse engineered and supported by Project IceStorm, and there is an open-source toolchain that can compile your hardware design to a bitstream. Has everything you need to start experimenting with FPGAs.
iCE40-HX8K Breakout Board | Lattice | iCE40-HX8K-CT256 | 7,680 | $49 | No | 8 LEDs, 8 switches. Very similar to the icoBoard, but no Raspberry Pi or pmod accessories.
iCE40 UltraPlus | Lattice | iCE40 UltraPlus | 5,280 | $99 | No | Four switchable FPGAs and a rechargeable battery. Bluetooth module, LCD display (240 x 240 RGB), RGB LED, microphones, audio output, compass, pressure sensor, gyroscope, accelerometer.
Go Board | Lattice | iCE40-HX1K | 1,280 | $65 | No | 4 LEDs, 4 buttons, dual 7-segment LED display, VGA, 25 MHz on-board clock, 1 Mb flash.
snickerdoodle | Xilinx | Zynq 7010 | 28K | $95 | Yes | Xilinx Zynq 7-series SoC: ARM Cortex-A9 processor plus Artix-7-class FPGA fabric. 125 I/O pins. 1 GB DDR2 RAM. Texas Instruments WiLink 8 wireless module for 802.11n Wi-Fi and Bluetooth 4.1. No LEDs or buttons, but easy to wire up your own on a breadboard. If you want to use a baseboard, you'll need a snickerdoodle black ($195) with the pins in the "down" orientation (e.g. the "breakyBreaky breakout board" ($49) or the piSmasher SBC ($195)). The base snickerdoodle only comes with pins in the "up" orientation and doesn't support any baseboards, but you can still plug jumpers into the pins and wire things up on a breadboard.
numato Mimas A7 | Xilinx | Artix-7 | 52K | $149 | No | 2 Gb DDR3 RAM. Gigabit Ethernet. HDMI in/out. 100 MHz LVDS oscillator. 80 I/Os. 7-segment display, LEDs, buttons. (Found in this Reddit thread.)
Ultra96 | Xilinx | Zynq UltraScale+ ZU3EG | 154K | $249 | Yes | One of the latest Xilinx SoCs. 2 GB (512M x 32) LPDDR4 memory. Wi-Fi / Bluetooth. Mini DisplayPort. 1x USB 3.0 Micro-B, 2x USB 3.0 Type A. Audio I/O. Four user-controllable LEDs. No buttons and limited LEDs, but easy to wire up your own on a breadboard.
Nexys A7-100T | Xilinx | Artix-7 | 15,850 | $265 | No | 128 MiB DDR2 RAM. Ethernet port, PWM audio output, accelerometer, PDM microphone, etc. 16 switches, 16 LEDs. 7-segment displays. USB HID host for mice, keyboards, and memory sticks.
Zybo Z7-10 | Xilinx | Zynq 7010 | 17,600 | $199 | Yes | Xilinx Zynq 7000 SoC (ARM Cortex-A9 plus 7-series FPGA fabric). 1 GB DDR3 RAM. A few switches, push buttons, and LEDs. USB and Ethernet. Audio in/out ports. HDMI source + sink with CEC. 8 processor I/O, 40 FPGA I/O. A faster version (Zybo Z7-20) is available for $299.
Arty A7 | Xilinx | Artix-7 | 15K | $119 | No | 256 MB DDR3L. 10/100 Mbps Ethernet. A few switches, buttons, LEDs.
DE10-Standard | Altera | Cyclone V | 110K | $350 | Yes | Dual-core Cortex-A9 processor. Lots of buttons, LEDs, and other peripherals.
DE10-Nano | Altera | Cyclone V | 110K | $130 | Yes | Same as the DE10-Standard, but with fewer peripherals, buttons, LEDs, etc.

Winner:

icoBoard ($100).
The icoBoard plugs into a Raspberry Pi, so it's similar to having a SoC. The iCE40-HX8K chip comes with 7,680 LUTs (logic elements). This means that after you learn the basics and create some simple circuits, you'll also have enough logic elements to run the VexRiscv soft-core CPU (the lightweight Murax SoC).
The icoBoard also supports a huge range of pluggable pmod accessories, so you can pick whatever peripherals you're interested in and buy more in the future.
Every FPGA vendor keeps their bitstream format secret. (Here's a Hacker News discussion about it.) The iCE40-HX8K bitstream has been fully reverse engineered by Project IceStorm, and there is an open-source set of tools that can compile Verilog to iCE40 bitstream.
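To give a flavor of that flow, here's a minimal sketch of the IceStorm toolchain for this chip, assuming a design in top.v and a pin-constraint file pins.pcf (the file names are placeholders):

```sh
# Synthesize Verilog to a netlist (Yosys)
yosys -p "synth_ice40 -blif top.blif" top.v

# Place and route for the HX8K part in the CT256 package (arachne-pnr)
arachne-pnr -d 8k -P ct256 -p pins.pcf top.blif -o top.asc

# Pack the textual bitstream into a binary, then flash it (IceStorm)
icepack top.asc top.bin
iceprog top.bin   # for FTDI-attached boards like the HX8K Breakout
```

(The icoBoard is normally programmed from the attached Raspberry Pi rather than over FTDI, so the last step differs there, but the synthesis and place-and-route steps are the same.)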
This open toolchain means that you have the freedom to do some crazy experiments, like asynchronous circuits or generating bitstreams from your own custom tools.
You don't really have the same freedom to explore these things with Xilinx or Altera FPGAs. (Especially asynchronous circuits.)

Second Place:

iCE40-HX8K Breakout Board ($49)

Third Place:

numato Mimas A7 ($149).
An excellent development board with a Xilinx Artix-7 FPGA, so you can play with a bigger, faster FPGA and run a full RISC-V soft core with all the options enabled at a much higher clock speed. (The iCE40 FPGAs are a bit small and slow.)
Note: I've changed my mind several times as I learned new things. Here are some of my previous thoughts.

What did I buy?

I ordered an iCE40-HX8K Breakout Board to try out the IceStorm open-source tooling. (I would have ordered an icoBoard if I had found it earlier.) I also bought a numato Mimas A7 so that I could experiment with the Artix-7 FPGA and Xilinx software (Vivado Design Suite).

Questions

What can I do with an FPGA? / How many LUTs do I need?

submitted by ndbroadbent to FPGA [link] [comments]

Anyone bullish on XLNX?

There's a pretty interesting debate in the AI space right now on whether FPGAs or ASICs are the way to go for hardware-accelerated AI in production. To summarize, it's more about how to operationalize AI - how to use already trained models with millions of parameters to get real-time predictions, like in video analysis or complex time series models based on deep neural networks. Training those AI models still seems to favor GPUs for now.
Google seem to be betting big on ASICs with their TPU. On the other hand, Microsoft and Amazon seem to favor FPGAs. In fact, Microsoft have recently partnered with Xilinx to add FPGA co-processors to half of their servers (they were previously using only Intel's Altera parts).
The FPGA is the more flexible piece of hardware, but it is less efficient than an ASIC, and FPGAs have been notoriously hard to program against (though things are improving). There's also a nice article out there summarizing the classic FPGA conundrum: they're great for designing and prototyping, but as soon as your architecture stabilizes and you're looking to ramp up production, taking the time to do an ASIC will more often be the better investment.
So the question (for me) is where AI inference will land in that regard. I'm sure Google's projects are large-scale enough that an ASIC makes sense, but not everyone is Google. And there is so much research being done in the AI space right now, with so many promising new ideas coming out, that being more flexible might carry an advantage. Google have already put out three versions of their TPU in the space of two years.
Which brings me back to Xilinx. They have a promising platform for AI acceleration both in the datacenter and embedded devices which was launched two months ago. If it catches on it's gonna give them a nice boost for the next couple of years. If it doesn't, they still have traditional Industrial, Aerospace & Defense workloads to fall back on...
Another wrinkle is that their SoCs are being used in crypto mining rigs like the Antminer, so you never know how that demand is gonna go. As the value of BTC continues to sink there is constant demand for more efficient mining hardware, and I do think cryptocurrencies are here to stay. While NVDA has fallen off a cliff recently due to excess GPU inventory, XLNX has kept steady.

XLNX TTM P/E is 28.98
Semiconductors - Programmable Logic industry's TTM P/E is 26.48

Thoughts?
submitted by neaorin to StockMarket [link] [comments]

FAQ#1: Can I mine Bitcoins using this old thing I found in my closet?

If you're wondering whether or not it is feasible to mine Bitcoin using your CPU, GPU (video card), cell phone, PS3, Xbox, or original 1985 NES, read on.
The true answer to that question, if you are being pedantic, is: Yes, you can!
The time required to get set up initially, and the electrical inefficiency of the hardware in question, usually lead economically minded people to a different answer, however. Which is: No!
Bitcoin mining is a highly competitive technological arms race, and you should never bring a knife to a gun fight.
The only two types of Bitcoin miners you want to be looking at right now are:
Feel free to kindly link any newbies asking a similar question to this post and downvote their original one. You can find this post stickied in the sidebar.
submitted by HighBeamHater to BitcoinMining [link] [comments]

I am going to design an Async ASIC chip for a class project.

Background:
I am enrolled in a VLSI course at a university. At the end of this class we will be manufacturing and testing our very own IC design.
Part of the class is a 5-week, four-person team project that is entirely open ended. We have coordinated with a professor who specializes in asynchronous design, and we are planning on making an asynchronous ASIC bitcoin miner. I do not expect to be competitive in the bitcoin mining field; we are designing this chip as a class project to demonstrate our ability in VLSI design. This is a 600 nm process, which is seriously like 1990s technology. This will not be a particularly fast chip, but it might be a little bit faster than a standard CPU, if and only if we can implement the SHA algorithm in hardware well enough.
Question:
I know relatively little about bitcoin. I have been subbed here for about three weeks, and I have read the Wikipedia pages on Bitcoin and on the SHA-2 algorithm. I am looking for website resources that explain hardware implementations of SHA-256, and I was wondering if anybody could point me towards helpful pages. Additionally, I was wondering if you could provide any other good website resources as starting points for bitcoin mining information, including specific information about the bitcoin mining protocol. When I search, I find a lot of forum debates but no real information. Are there any websites that explain this sort of thing that are well known in the bitcoin community? Should I be turning to research papers at my university library to find this?
Specifically, I know we need to implement SHA-256 as part of the bitcoin algorithm. In an ASIC rig, does the SHA-256 hardware exist as a co-processor to a general-purpose CPU, or does the ASIC directly implement the entire algorithm? Basically, I am completely unaware of the CPU architecture required to implement bitcoin mining, yet that is what I have set out to do, because general-purpose CPUs are easy...(ish)... If it is simply a matter of understanding the SHA-256 algorithm and implementing my own architecture, that is okay, but I would rather not reinvent the wheel if I do not have to. An open-source Verilog FPGA implementation would be a great starting point.
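For reference, the heart of any SHA-256 miner is the compression function, applied 64 rounds at a time (and twice per candidate header, since Bitcoin uses double SHA-256). Here is a minimal, illustrative Verilog sketch of a single round, leaving out the message schedule and the double-hash plumbing:

```verilog
// One round (of 64) of the SHA-256 compression function, as pure
// combinational logic. A full miner unrolls and pipelines these,
// twice over for Bitcoin's double SHA-256, plus the message-schedule
// (W_t) logic, which is omitted here.
module sha256_round (
    input  wire [255:0] state_in,  // working variables {a,b,c,d,e,f,g,h}
    input  wire [31:0]  k_t,       // round constant for this round
    input  wire [31:0]  w_t,       // message-schedule word for this round
    output wire [255:0] state_out
);
    wire [31:0] a = state_in[255:224], b = state_in[223:192],
                c = state_in[191:160], d = state_in[159:128],
                e = state_in[127:96],  f = state_in[95:64],
                g = state_in[63:32],   h = state_in[31:0];

    function [31:0] rotr(input [31:0] x, input integer n);
        rotr = (x >> n) | (x << (32 - n));
    endfunction

    wire [31:0] s1  = rotr(e, 6) ^ rotr(e, 11) ^ rotr(e, 25);  // Sigma1
    wire [31:0] ch  = (e & f) ^ (~e & g);                      // Ch
    wire [31:0] t1  = h + s1 + ch + k_t + w_t;
    wire [31:0] s0  = rotr(a, 2) ^ rotr(a, 13) ^ rotr(a, 22);  // Sigma0
    wire [31:0] maj = (a & b) ^ (a & c) ^ (b & c);             // Maj
    wire [31:0] t2  = s0 + maj;

    assign state_out = {t1 + t2, a, b, c, d + t1, e, f, g};
endmodule
```

A dedicated miner typically instantiates these rounds back to back with pipeline registers between them, so a new header candidate can be tried every clock cycle; there is no general-purpose CPU in the hash datapath.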
Thanks for your pointers.
submitted by reParaoh to BitcoinMining [link] [comments]

Cryptography and the future of single-crypto coins versus multi-crypto coins

Bitcoin, currently the world's largest decentralized digital currency, has made headlines throughout the world. With market values of over $1,000 per coin, it has become a valuable commodity to invest in. Many individuals, though, do not understand the basic cryptographic underpinnings of Bitcoin and of altcoins (a term used to describe other digital currencies).
Bitcoin is based on a cryptographic hash function called SHA-256, which is part of the SHA-2 family. SHA-2 was designed by the US National Security Agency (NSA) and published in 2001.[1] It is currently one of the most secure and most widely used cryptographic hash functions in the world.
While SHA-2 has proven its strength over the years, it is not without theoretical weaknesses. Like any cryptographic hash function, it is in principle susceptible to birthday and collision attacks. While SHA-2 is still considered secure despite these possible weaknesses, it is difficult to say what the future holds for it. If the cryptographic basis of a currency could be undermined at any point in the future, how can that currency maintain long-term value?
Enter Quark (QRK). Quark is a distributed, non-centralized currency much like Bitcoin, but with several key differences. This article will focus solely on the cryptographic changes. Quark employs not one cryptographic function but a combination of six: BLAKE, BMW (Blue Midnight Wish), Grøstl, JH, Keccak, and Skein. Five of these were finalists of the NIST hash function competition, which ended in October 2012 (BMW was a second-round candidate).[2] While I won't go into specific detail on each algorithm here, I'll explain why multiple algorithms help Quark in the long run.
As previously mentioned, Bitcoin (and other digital currencies) are based on a single cryptographic algorithm, leaving them open to possible attacks in the future. While Bitcoin users have discussed the possibility of switching hash functions in the future [3], it is not a guaranteed option, and it could destabilize Bitcoin's value. With the market booming with ASIC and FPGA mining hardware, there are more opportunities for individuals to turn their hardware power toward attacking SHA-2 in an attempt to destroy Bitcoin's value.
With multi-layer, multi-hash designs such as the one Quark employs, it is extremely difficult to break down the entire structure. Even if a weakness is found in one (or even more) of the cryptographic algorithms Quark chains together, it doesn't compromise the entire structure, thanks to the "avalanche" effect of the chained hash functions.
Multi-tiered hash chains provide a strong basis for a currency to stand the test of time. Not only do they provide much more durability and security than single-hash designs, they also sharply limit the ability of ASIC miners to disrupt the mining market, keeping mining viable and worth the time and computing power of "entry-level" and "mid-level" miners.
Resources: [1] http://en.wikipedia.org/wiki/SHA-2 [2] http://en.wikipedia.org/wiki/NIST_hash_function_competition [3] https://bitcointalk.org/index.php?topic=191.msg1585#msg1585
EDIT: 11:31 CST, 12/16/13 ... Fixed known typos.
submitted by mechman991 to QuarkCoin [link] [comments]

An opinion on how Bitcoin is good for gaming [xpost r/truegaming]

A few days ago, I was a bit bored (read: procrastinating), and I ended up reading about Bitcoin. For those of you unfamiliar with the topic, the gist is this:
Bitcoin is a decentralized currency, based entirely in cyberspace, whose units are derived from cryptographic hashes. The value is whatever value the market as a collective assigns to it, much like most modern currencies (USD, Euro, etc.). Transactions are enacted entirely over p2p systems, so it has useful applications ranging from legal online marketplace deals, to gray-area deals such as Wikileaks donations, to illicit activities on Silk Road, much like real-world cash. New bitcoin units are 'mined' by using a computer to solve the cryptographic puzzles provided by the p2p network.
Summarized from Wikipedia and the official site.
What applies to gaming is the mining of new coins. Originally users would let their CPU mine the blocks, but as the difficulty of mining increased, users had to pool resources and work together. Eventually, someone realized that more efficient mining could be done using a computer's GPU as opposed to the CPU. The reason this was a big step has to do with the GPU's ability to crunch the numbers of the hashing workload much more effectively (for more information read the informative wiki page on it). As more users switched to this method, the difficulty increased to balance the new capability. Avarice finds a way, though: the more devoted users are employing dedicated 'mining rigs' to increase profits. Some entrepreneurial souls have started producing and marketing advanced rigs for extreme mining (e.g. ztex and butterflylabs).
This may not be good for the Bitcoin economy as a whole, though, because it creates an upper echelon of power users who can afford the economic arms race, and it may not be ultimately sustainable. BUT, this would be good for gaming, because if the arms race continues we could see accelerated development of GPU technology. This is all conjecture of course, but in my opinion, a Bitcoin bubble may be good for the community as a whole.
Conflict of Interest note: I am not affiliated with Bitcoin and do not personally invest in this currency.
Comment from thread:
You wrongly assume that bitcoins are even a factor in pushing hardware manufacturers to improve. It's not even a drop in the bucket compared to all other more relevant types of software that make use of graphics cards. -FromMars
Reply:
My view was that this niche allows smaller companies to come in and compete with the larger manufacturers (AMD/Nvidia) and bring new ideas to the table for efficient processing. I concede that high end applications will probably drive the gpu industry, but I posit that there is a chance for outside development ideas. -wessubba
submitted by wessubba to hardware [link] [comments]

