I think it's a bit rich to describe this as the 'future of video game preservation'.
The MiSTer project https://github.com/MiSTer-devel/Wiki_MiSTer/wiki more rightfully deserves that title. It's got a huge range of systems (across consoles, arcade and microcomputers) and it's all GPL licensed. The base board is a Terasic DE10 Nano, which is proprietary, but all other hardware required is open source.
The MiSTeX project aims to make MiSTer portable across different FPGA platforms (https://github.com/MiSTeX-devel), so a DE10 Nano won't be mandatory, enabling a new ecosystem of open hardware and commercial for-profit solutions.
I take no issue with people wanting to make money in this space. I take great issue with trying to gatekeep system preservation behind a mostly closed system you stamp an 'open' moniker on.
There are absolutely 0 mentions of the word "MiSTer", which negatively impacts this "open" project.
Their self-appraising pitch applies more to the MiSTeX project than to themselves.
Yeah but they seem to be suggesting no one has done fpga emulation before.
(I'm biased, I love my mister!)
After DE-10 Nanos went up in price (likely due to MiSTer), there were rumblings that the project was too tied to the hardware to be portable.
I missed out when things were cheap and always assumed buying a board at the higher prices would be the trigger which caused the project to move to the next generation of hardware.
Yeah I paid the higher price unfortunately. The best thing about it is I don't spend all my time fiddling with settings like I do with software emulators.
Which is just rich, as Analogue takes advantage of preorders to limit hardware on a per-order basis, meaning they are always out of stock. Don't have this issue with the MiSTer.
Five years ago the Mister people were similarly unable to STFU about Raspberry Pis. Mister is still emulation, it still has tons of issues, and it’s still $700 invested before you’re actually playing games. Meanwhile we’re in a golden age of RGB mods, flash carts, and optical drive emulators for original hardware.
While I'm a strong supporter of your position that the MiSTer (and any FPGA console implementation) is still emulation, it's worthwhile to keep in mind that this will indeed be _a_ future for preservation (I'm unsure about it being the only way, and I'm warming to, but not sold on, it being the best way).
Original hardware is great, but it's getting older, and failing. The Super Nintendo / Super Famicom, for instance, uses a ceramic oscillator as a clock source for its sound processing unit, and as the console is designed, the sound clock is essential for keeping the entire system in sync. Members of the Tool Assisted Speedrunning community have been experimenting with parts replacements to resuscitate this clock source with only mixed success. This is one of many pieces of silicon on these old platforms that will continue to fail as time goes on.
We can't make brand new classic consoles because the parts are just unavailable, and the industry has a well trodden path of reimplementing discrete electronics in FPGA for decades now with great success. If your goal is to keep playing games for a while longer on your OG console, then heck yeah, go for it. It does indeed do the thing. But if we want to be able to preserve this history in as close to the original form factor and play environment as possible, then we have to explore other options, and FPGA system cloning is a grand way to do this.
I think the most worrying thing about emulating old consoles by reverse-engineering them is that you're running out of time to test it. Say for example that the original Luigi looks blue in the hardware it was made for, but in an emulator he looks green. If there are no original NES units left, you will never be able to tell that the emulator's code doesn't match how the NES actually worked. From that point on, the emulator experience is as canonical as it gets.
Different NES revisions actually output different colors. Plus, originally it's an RF signal, then composite. It was displayed on some CRT. It's possible to RGB mod an NES, but is that authentic? No.
The one true color of Luigi is the one that you saw as a child on a Sunday afternoon in the eighties. Lost in time, like tears in rain.
My MiSTer FPGA was about $200 before I was playing games. $700 is either hyperbole or have DE10 Nano board prices really gone up THAT much?
After the DE10 Nano, you just need a RAM module and an I/O board and you're ready to go.
The MSRP of the DE10 Nano is $225, but they are still commonly out of stock. The going price for a scalped DE10 Nano is over $300, so, all in, you are likely looking at $400 to $500. I often kick myself for not getting multiple DE10 Nanos earlier (there are more projects than just MiSTer that use them).
Just tried from the Terasic website. I tried to order two for $225 each and they seemed to be available.
That said, the price has almost doubled in a few years...
"FPGA is emulation"
You can have any number of issues with Mister, but this is just flat out incorrect if you know the definition of either of those words.
They may have read this article written by Near arguing that FPGA-based products are still (hardware-based) emulation: https://archive.is/fWosI
It's a well-written article, and presents a balanced view. Its stance is against the idea that FPGA-based emulation is always better than CPU-based emulation.
There's no circuit-level reproduction happening anywhere yet, it's basically emulation with a hardware clock. Analogue and kevtris know that FPGA isn't magic. Rather than contribute to the knowledge pool, like the MiSTer project, they're being bad actors and trying to muddle the situation to make money off of it. Near was trying to clarify the situation, but now we see where good intentions get you.
That is a fantastic read. Thanks.
Of course Mister uses emulation - there are only a few chips in older arcade boards that can truly be simulated in an FPGA with circuit accuracy.
The Mister SNES core is less accurate than the software emulator bsnes; not all games work, or there are bugs - they still "emulate" the hardware as they interpret it.
FPGA is just a different way of "hardware emulation" instead of traditional "software emulators".
The only real advantage of FPGA compared to software is the improved latency.
An FPGA can be 100% hardware perfect with practically no extra processing cost. The potential is there, even if it's not initially exploited.
CPU-based emulation will require a fairly powerful CPU just to emulate the SNES. It's pretty expensive to emulate hardware in 180-190 ns increments.
This isn't quite true - you lose some of the weird analog stuff that may have happened. On a well-designed board, though, you can get essentially perfect cycle-accurate reproduction.
Not only that - the ability to replicate hardware allows rendering some effects that can't be rendered with software emulation. I'll try to find the video I saw on YT explaining that point.
Perhaps you are referring to this video [0] by What's Ken Making which goes into detail about the advantages/disadvantages of both software and hardware emulation?
At 13:43, he demonstrates a couple of snes games with effects that don't work well in software emulation (along with what the effects / game mechanics should be on original hardware). At 22:25 he shows these same games running on FPGAs (analogue pocket, mister) and behaving correctly, just like on original hardware.
As an aside, if this isn't the one you were referring to, it's still an awesome video by an amazing creator; I'm a big fan of this stuff after discovering it by accident. He gives a nice introduction to FPGAs in this same video.
The argument being made is that the distinction between an FPGA and CPU is irrelevant to determine whether something is emulation.
An emulator accelerated by an FPGA is still an emulator and a CPU simulating a console at the circuit level is no different in accuracy than an FPGA doing the same thing.
The main difference is the experience, where the FPGA can offer an experience closer to the original than a CPU running an operating system can, especially for older consoles. Not because of a difference in emulation accuracy, but because the user experience is different.
That’s just building a unit to spec using an FPGA instead of a chip fab. If it’s a 1-1 copy, emulation would not appear to be the correct term.
Say I have a black-box chip that takes in 3 bits and spits out 3 based on the input. If I make a bit-perfect implementation of it, but my chip has additional outputs that correspond to unused inputs and hence are never used, is it still considered emulation? It differs only minimally, in a negligible manner.
> if you know the definition of either of those words
Let's put that to the test, ok
https://www.merriam-webster.com/dictionary/emulate
1a: "to imitate by means of an emulator"
https://www.merriam-webster.com/dictionary/emulator
2: "hardware or software that permits programs written for one computer to be run on another computer"
Yup, FPGA recreating other hardware definitely fits the bill of emulation. Granted, Analogue has played up a marketing spin that boxes in "emulation" to mean "software running on some generic CPU" in order to claim "no emulation", but it's just marketing doing the work of marketing.
WINE fits that definition but is generally considered to be Not Emulation in a technical (if not colloquial) sense.
It depends on your interpretation of the definition. If the two "computers" that the webster definition is referring to have a different architecture, then Wine does not fit that definition, because Wine relies on the fact that it's the same computer architecture, namely an x86 PC.
If you combine Wine with Rosetta to run Windows programs on an M1 Mac, then it is definitely emulation by any definition. But then it's not Wine that's doing the emulation, it's Rosetta.
> Yup, FPGA recreating other hardware definitely fits the bill of emulation.
It really doesn't. The FPGA actually implements the target hardware, rather than using other hardware to produce equivalent behavior.
That's not to say that no emulation is involved, as not all of the functionality is actually achieved through the FPGA, but where it is, it's perfectly valid to distinguish it from emulation.
It doesn't implement the original hardware. It's not a copy of the original chip. Reverse engineering is used to get the behavior of the hardware. They reproduce the inputs and outputs with Verilog and write it to an FPGA.
If it weren't emulation, they would decap the original chip and directly copy the chip, gate-by-gate. It could be done, but it hasn't been done that way anywhere to date.
At the end of the day both FPGAs and software emulators are Turing machines that produce a set of outputs given a set of inputs: any logic an FPGA can implement can also be implemented in software - it's computer science 101.
FPGAs aren't magically more accurate. That is only up to the programmer and what effort they put in.
The main difference is efficiency and parallelism: it's much easier to reliably produce cycle-accurate parallel outputs in real time with an FPGA, compared to software running on a multi-tasking OS with many layers of abstraction and no deterministic real-time guarantees.
But just as a single-core processor can fake multitasking by slicing time between processes (preemptive multitasking), software emulation can mimic parallelism if the host is beefy enough compared to the old-school system it's emulating. The larger the performance / clock speed gap between the host and target, the more indistinguishable from a truly parallel FPGA an emulator can be.
Software emulation also has practical advantages for developers: while FPGAs force you to painstakingly implement every bit of functionality at the logic gate level, with software you can start off with a much higher level model of the target system that's much easier to implement, and mix & match that with more precise low level simulation where it matters. The time this frees up (+ the availability of various libraries) allows the developer to spend more time researching the original system and adding modern quality of life features that just wouldn't be possible otherwise.
Great article on the topic: https://archive.is/fWosI
Nobody has put these chips under a scanning electron microscope, catalogued each gate, and then recreated those in extremely verbose Verilog.
That's what "implementing the target hardware" in an FPGA would consist of.
Even if this extremely-arduous process were to take place, you would likely still be unable to reproduce the analog chips used for audio and video in most FPGAs - certainly not the cheap FPGAs MISTer and friends use.
> Nobody has put these chips under a scanning electron microscope, catalogued each gate, and then recreated those in extremely verbose Verilog.
I don't think anyone was doing that when iterating on the hardware back in the day, either. Dollars to donuts, the 65C02 was not based on a deep empirical analysis of the behavior of as-implemented 6502s, but was rather produced from modifications to the design spec that the original 6502 was itself implemented from, with the intent of maintaining compatibility with the original design.
Same thing here. An FPGA implementation of the original schematics of the hardware can be viewed as another instance of variant hardware built against the original design, whereas software emulation is simulating the outward behavior of the original hardware without any ability to be implemented directly from the original design at all.
> Even if this extremely-arduous process were to take place, you would likely still be unable to reproduce the analog chips used for audio and video in most FPGAs
Which returns us to the original description of these solutions being mostly hardware implementation via FPGA, but not entirely, as some emulation is still needed.
FPGAs are reimplementations, to me, not emulation. But I don’t begrudge anyone who wants to be a stickler on the definition of “emulation”.
Even the original hardware “emulates” a game if you want to get down to it.
That’s how I see it too. The FPGA versions are knock-offs, like what would have been done back in the day, but with fancy future hardware. A gate-level reproduction would be kinda strange aside from a pure desire for preservation. (Which is still a decent goal, just not the goal of these projects.)
So your argument is that it's closer to virtualization than to emulation?
Not really. And I'm not sure the concept of virtualization makes sense when applied to hardware in the first place.
The FPGA is actually implementing the hardware. It's like using a newly manufactured edition of the original.
The FPGA is used to implement an approximation of the original hardware’s functionality - not a perfect clone of the original hardware itself, which means they have their own unique bugs and inaccuracies compared to the original designs.
I think “emulation” fits as a reasonable description of this, personally - at least as something an average person will understand.
(2022) - I don't think there is anything new here since this was announced two years ago.
FWIW, OpenFPGA on the Analogue Pocket works pretty well. Many of the most popular MiSTer cores were ported over and there are some nice desktop tools to make it easy to configure [1].
Wow, that looks useful! Out of curiosity, I scanned the latest Windows release (4.5.0) with VirusTotal and it reported several malware hits. I realize I could manually audit the source code and build an .exe from scratch but do you think the release hosted on GitHub is malicious?
See https://www.virustotal.com/gui/file/7695a53b129c4e46eaf60cb0... for the VirusTotal report.
There's a note on the README about false positives from virus scanners on Windows, so maybe that is all it is. But I can't speak to it beyond that. I've only used the Mac OS version.
I wish people would stop treating FPGAs as the Second Coming of the Lord or whatever. It is really not.
It's emulation, plain and simple. Not better or worse than software emulators. It usually lags behind the pure software emulators because there are fewer devs and because emulating stuff in hardware is harder than emulating stuff in software.
Just because it's in hardware doesn't mean that it is "better" or "more accurate".
The main advantage of FPGA emulation is concurrency. When you’re emulating a piece of hardware with multiple chips in software, you’re often forced to run the emulation in batches (i.e. run the CPU for 10 cycles, then run the video chip for 10 cycles, then run the audio chip for 10 cycles) for performance reasons. This matters because some games require a higher level of timing accuracy to (for example) paper over bugs in the game code, or perform fancy graphical tricks. There are cycle-accurate software emulators, but they aren’t really playable for consoles after the 16-bit era and require relatively powerful CPUs. FPGAs allow you to run multiple chips in parallel, which eliminates this issue, allowing for accurate emulation in a small battery-powered handheld device.
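To make the batching idea concrete, here's a rough C sketch (all names and numbers are made up for illustration, not from any real emulator). A typical software emulator advances each chip in chunks, while the real hardware - and an FPGA clone of it - advances every chip on every single clock:

    #include <stdio.h>

    #define CYCLES_PER_FRAME 89342   /* illustrative number only */
    #define BATCH 10

    /* stand-ins for per-chip emulation routines (hypothetical) */
    static void cpu_run(int cycles) { (void)cycles; }
    static void ppu_run(int cycles) { (void)cycles; }
    static void apu_run(int cycles) { (void)cycles; }
    static void cpu_tick(void) {}
    static void ppu_tick(void) {}
    static void apu_tick(void) {}

    /* batched: cheap, but chips only observe each other every BATCH cycles */
    static void run_frame_batched(void) {
        for (int i = 0; i < CYCLES_PER_FRAME; i += BATCH) {
            cpu_run(BATCH);
            ppu_run(BATCH);
            apu_run(BATCH);
        }
    }

    /* cycle-interleaved: correct ordering every cycle, far more overhead */
    static void run_frame_interleaved(void) {
        for (int i = 0; i < CYCLES_PER_FRAME; i++) {
            cpu_tick();
            ppu_tick();
            apu_tick();
        }
    }

    int main(void) {
        run_frame_batched();
        run_frame_interleaved();
        puts("done");
        return 0;
    }

The batched loop is why a mid-frame raster trick or a tight audio sync hack can break in a software emulator: within a batch, the video and audio chips don't see the CPU's writes at the exact cycle they happened.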
For people interested in this topic: I converted my 8-bit emulators to a complete 'cycle-stepped model' (the last holdouts were the CPU emulators), and wrote a couple of blog posts in the process:
From newest to oldest:
- Cycle-stepped Z80 emulator: https://floooh.github.io/2021/12/17/cycle-stepped-z80.html
- Z80 instruction timing details: https://floooh.github.io/2021/12/06/z80-instruction-timing.h...
- Cycle-stepped 6502 CPU emulator: https://floooh.github.io/2019/12/13/cycle-stepped-6502.html
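For anyone skimming, here's a minimal sketch of what 'cycle-stepped' means (hypothetical illustration code, not the actual API from the posts above): the CPU exposes a tick function that advances exactly one clock cycle, so the caller can interleave it with every other chip, cycle by cycle:

    #include <stdint.h>

    typedef struct {
        uint16_t pc;      /* program counter */
        uint8_t  a;       /* accumulator */
        uint8_t  opcode;  /* opcode currently being executed */
        int      step;    /* which cycle of the current instruction we're on */
    } cpu_t;

    /* advance the CPU by exactly one clock cycle */
    static void cpu_tick(cpu_t *c, uint8_t mem[65536]) {
        switch (c->step) {
        case 0:                       /* cycle 0: fetch the opcode */
            c->opcode = mem[c->pc++];
            c->step = 1;
            break;
        case 1:                       /* cycle 1: fetch operand and execute */
            if (c->opcode == 0xA9) {  /* e.g. a 6502-style LDA #imm (2 cycles) */
                c->a = mem[c->pc++];
            }
            c->step = 0;              /* done; the next tick fetches a new opcode */
            break;
        }
    }

    int main(void) {
        static uint8_t mem[65536];
        mem[0] = 0xA9; mem[1] = 0x42; /* LDA #$42 */
        cpu_t cpu = {0};
        cpu_tick(&cpu, mem);          /* cycle 0 */
        cpu_tick(&cpu, mem);          /* cycle 1: A is now 0x42 */
        return cpu.a == 0x42 ? 0 : 1;
    }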
Hey Floh! I ported your 6502 emu to rust for my AccurApple emulator. You saved me tons of time :-)
Thanks! Glad to hear it's useful :)
I guess it would be a great hacker project to build a PCIe card with an FPGA (or several) that can e.g. be used to accelerate emulators.
Or would a GPU be more fit for this purpose, and a more cost-effective and practical solution?
GPUs are for data-parallel operations, which doesn't help for emulators, which are (for the most part) running much slower traditional CPUs and a handful of support chips all at the same time - a workload that is not data parallel.
Very few systems depend on this to the point of being an issue. The only one that this was a reasonable problem for was the PS2.
For pretty much all 8-bit home computers, cycle-correct emulation is essential: without it most modern scene demos simply don't work, and neither do a lot of old games (although those are usually more forgiving).
In later computer architectures, hardware components have become more and more decoupled from each other, running on separate clocks and busses, with caches and buffers in between and what not, all of which makes timing less predictable but also less important in emulation, giving the emulation much more slack when it comes to "synchronicity" (which ironically makes modern computer systems "easier" to emulate than older systems - at least when it comes to correct timing).
But 8-bit home computers (and also the early 16-bit systems like the Amiga and Atari) were essentially a single 'mega-chip' all running off the same clock, with all chip timings being deterministic - which was, and is, exploited by software.
Yes but they're so slow that it's trivial to emulate them
EDIT: That's why I point out PS2. The PS2 was the first system that was fast enough while being dependent on timing enough that there were huge issues with its emulation
Near / Byuu wrote a lot about their experience with timing bugs on the SNES and how it takes a quite powerful CPU to emulate it cycle-accurately.
https://arstechnica.com/gaming/2011/08/accuracy-takes-power-...
Writing an Apple2e emulator, I wouldn't say it's trivial :-/ Proper video decoding is not exactly easy and neither is speaker emulation and neither is floppy disk emulation.
It's easy if you're looking at 99% accuracy. But if you aim at 99.9% it's a different story.
Plain and simple, there's no fully accurate Apple2e emulator so far and it's not like nobody has given it a try in the last 20 years...
One practical difference is that software emulation almost always runs under an operating system and all sorts of cruft which can add latency and jitter. An FPGA just does the one thing.
There are enough bare-metal emulators out there by now that show you can definitely run emulators without any OS - BMC64 and PiStorm/Emu68, just to name a few.
Just because emulators often run under an OS doesn't mean that software emulation is inferior to FPGA emulation.
> Not better or worse than software emulators.
FPGAs and the associated dev tools allow for timing analysis/timing constraints that cannot be done with software based emulators.
"cannot" isn't accurate. "cannot easily" would be accurate.
For example, FPGAs themselves can be emulated in software with precise timing accuracy, so anything you can run on an FPGA can be emulated in software without the FPGA. It's difficult and sometimes infeasible with current CPUs to get up to the same speed though. This depends a lot on the FPGA circuit being emulated.
Source: I used to work on an FPGA-targeting hardware compiler and accelerated FPGA circuit simulators.
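As a toy illustration of what 'emulating an FPGA circuit in software' looks like (hypothetical code, not from any real simulator): a synchronous design boils down to evaluating the combinational logic and then latching the registers, once per clock edge, which is essentially what an RTL simulator or a Verilator-style compiled model does:

    #include <stdint.h>
    #include <stdio.h>

    typedef struct {
        uint8_t counter;    /* a 4-bit register in the "design" */
        uint8_t carry_out;  /* a combinational output derived from it */
    } design_state_t;

    /* one clock cycle: combinational logic first, then the registers update */
    static void design_clock(design_state_t *s) {
        uint8_t next = (uint8_t)((s->counter + 1) & 0x0F); /* combinational adder */
        s->carry_out = (s->counter == 0x0F);               /* combinational compare */
        s->counter = next;                                 /* flip-flop captures on the edge */
    }

    int main(void) {
        design_state_t s = {0, 0};
        for (int cycle = 0; cycle < 20; cycle++) {
            design_clock(&s);
            printf("cycle %2d: counter=%u carry=%u\n", cycle, s.counter, s.carry_out);
        }
        return 0;
    }

Every register and every LUT's worth of logic gets evaluated on the host CPU each cycle, which is why it's exact but can be orders of magnitude slower than the FPGA itself for large designs.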
Fair enough. Nyquist and all that - with a sufficiently fast CPU anything is possible.
The main advantage is it makes it possible to do more accurate emulation without needing to sacrifice performance in the same way as software emulators. But the very large gap between affordable CPU performance and affordable FPGA performance makes it not an obvious tradeoff.
Whilst it's true FPGA doesn't inherently imply accuracy, it does make it simpler to recreate things in a more accurate way, in particular around interactions between CPU, audio and video. It also enables accurate input latency with respect to these things.
Isn't it hardware emulation, built with software?
When you look at typical FPGA emulator source code, it often looks pretty close to software emulator code ported to VHDL/Verilog instead of an attempt to re-create the original reverse-engineered 'transistor-level design', which would automatically reproduce any 'undocumented behaviour' of the original chip (like http://www.visual6502.org/JSSim/index.html)
As such, an FPGA emulator isn't necessarily any closer to the original hardware behaviour than a software emulator. I guess the main advantage of FPGA is better performance on lower cost hardware.
> As such, an FPGA emulator isn't necessarily any closer to the original hardware behaviour than a software emulator. I guess the main advantage of FPGA is better performance on lower cost hardware.
The majority of FPGA chips are really expensive compared even to cheap (yet still way more powerful) SoCs (like the really cheap ARM SoCs used in sub-$100 handheld emulators). Also, technically an FPGA-based emulator could be more efficient than emulating everything in software, but AFAIK even the Analogue Pocket isn't that much better in battery life than, say, a Miyoo Mini+ (maybe because an ARM SoC has better energy management, but I don't know).
I think the main hype of FPGA is really the lower input latency, which is difficult to achieve with software emulation. There are still some tricks you can do in software that reduce input latency significantly, but they are generally expensive to compute [1], so it wouldn't be feasible on a cheap handheld device (at least not yet).
[1]: https://arstechnica.com/gaming/2018/04/better-than-reality-n...
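If I remember right, the main trick that article covers is 'run-ahead'. A rough sketch of the idea, with a hypothetical emulator API (no real frontend exposes exactly these calls): every displayed frame, the emulator quietly runs one or more frames into the future with the freshly polled input, shows that future frame, then rewinds to the real timeline:

    #include <stdbool.h>
    #include <stdint.h>
    #include <stdlib.h>

    /* hypothetical emulator-core API; real frontends differ */
    static void  emu_set_input(uint32_t input) { (void)input; }
    static void  emu_run_frame(bool render)    { (void)render; }
    static void *emu_save_state(void)          { return malloc(1); }
    static void  emu_load_state(void *s)       { (void)s; }
    static void  emu_free_state(void *s)       { free(s); }

    #define RUNAHEAD_FRAMES 1   /* frames of internal game lag to hide */

    /* called once per displayed frame with the freshly polled input */
    static void runahead_tick(uint32_t input) {
        emu_set_input(input);
        emu_run_frame(false);            /* advance the real timeline, hidden */
        void *snap = emu_save_state();   /* remember the real state */
        for (int i = 0; i < RUNAHEAD_FRAMES; i++) {
            emu_set_input(input);
            emu_run_frame(i == RUNAHEAD_FRAMES - 1); /* show only the last one */
        }
        emu_load_state(snap);            /* snap back to the real timeline */
        emu_free_state(snap);
    }

    int main(void) {
        for (uint32_t frame = 0; frame < 3; frame++) {
            runahead_tick(frame & 1);    /* pretend a button toggles */
        }
        return 0;
    }

The cost is emulating RUNAHEAD_FRAMES extra frames plus a state save/load every single frame, which is trivial for a desktop CPU emulating an 8/16-bit console but a real burden for a cheap handheld SoC.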
Many people think an FPGA is automatically accurate emulation because it's "hardware" and the original console is "hardware".
But the FPGAs are based on C software emulators because that's where all the knowledge in the world of how to emulate the original system is kept. You can't translate original hardware to Verilog and skip the figuring out how it works process.
There are a very few cores which are derived from die shots of the original chips - but in the majority of cases you're correct.
Most of the time die shots are used to extract data tables or firmware from ROMs rather than logic. But it's probably happened a few times.
You're getting into the weeds and will get different answers based on semantics, here.
An FPGA emulation isn't inherently better than a CPU-based emulation of any given chip; it's not more authentic because it still lacks the particular quirks of any old CPU that are associated with the way the hardware was laid out, path lengths, imperfections etc.
I'm glad it's introducing FPGA programming to a wider audience because FPGAs are probably going to become more important going forward - and are probably going to be what keeps Intel alive - but it doesn't make the emulator inherently better.
Intel is years behind Xilinx in tooling and high-end offerings and shows no signs of ever catching up.
> shows no signs of ever catching up
Personally I think that company's on its way out; the only question is how long it's going to take...
I recently bought an Analogue Pocket and I think it's a great piece of hardware, but I'm really not a fan of this company's business model. This page has told me nothing I actually want to know about OpenFPGA.
Here's my question: If I'm developing an FPGA core, why should I develop for OpenFPGA instead of MiSTer? I want to know why it's better for preservation that I develop for OpenFPGA. Is it a more portable platform that has a more guaranteed future? I need to be convinced that OpenFPGA solves problems that make it a more likely choice than MiSTer 10+ years from now.
If someone with more experience than me in this space can answer the above, I would be really grateful. I'm astounded that Analogue's page on OpenFPGA is all marketing fluff without actually answering this.
Because it's open. Except there's nothing open about it.
It uses "open" in the name, to try and confuse people into doing unpaid work for them.
I am hopeful the overlap between the levels of naivety needed and ability to do digital design will be non-existent.
Even if this were the future of emulation, we need a lot more than emulators for anything to really be considered the "future of video game preservation".
Modern games so frequently require connections to servers - often needlessly. For example, Ubisoft is shutting down the servers for 2014's The Crew on March 31, 2024. That's less than 10 years after release. When the servers are shut down, the entire game will stop working - including the lengthy single player campaign. No emulator will bring that game back from the dead.
In a similar story, Square Enix recently announced that they will be shutting down Nier: Reincarnation's servers in April 2024, less than 3 years after its worldwide release in 2021. And yes, it's a single player game. My wife is a fan of the Nier franchise and has been playing Nier: Reincarnation since its release. In a couple months she'll never be able to play it again. Square also pulled the plug on Babylon's Fall in 2023, less than 1 year after its 2022 release.
If you're concerned about game preservation, be prepared for the disaster coming soon. FPGAs won't be enough.
While this is being done intentionally now, it has been an issue for over a decade - and it hasn't kept people from preserving these games in the slightest.
Even now, there is a lot of work being put into reviving games that required master servers etc. by means of reverse engineering and essentially cracking. I do however agree with you that this is not the way it should be, and I still think developers and publishers should be legally bound to either allow software that they sell to function indefinitely or release the server code when they shut down the official servers.
I think "in the slightest" may be a bit of an exaggeration. Not every game with a required server connection has been resurrected. And as this problem continues to escalate, I'm doubtful that hackers and preservationists will be able to keep up.
I play Overwatch with my sons and not only can you not run a personal server, the game frequently has fundamental mechanic changes for certain characters (and all characters in this Tuesday’s update). The Overwatch of a year ago is a very different game than is now or will be a year from now.
If someone was to provide an Open Overwatch server, I don’t even know what it would look like at this point since game clients aren’t available for particular versions (maybe on PC, but not console). When Microsoft is done with Overwatch it’ll be gone.
I actually didn't count live service games like Overwatch in this category, to be honest.
Overwatch is deliberately designed as an ever evolving service instead of a product you purchase. With games like these, there is no way to ever archive them, since there is no canonical state in which the game remains for any sensible amount of time.
I understand that this is a cause of frustration and that there's a lot of issues with this but I'd keep those separate from one-time-purchase products which fortunately is still what games mostly are [...for now].
You're right, that was definitely an exaggeration on my end.
> Square Enix recently announced that they will be shutting down Nier: Reincarnation's servers in April 2024, less than just 3 years after its worldwide release in 2021. And yes, it's a single player game.
Hmm? The Nier games are always online? I keep meaning to try them (although by now I probably have to buy used discs). You're saying there's no point any more?
Nier: Reincarnation is a mobile game. If you're planning on playing Nier: Automata or Nier: Replicant, those are still available for the foreseeable future.
And amazing. I normally don’t play games through multiple times in a row, but I played Replicant through all endings and then finished the endings in Automata immediately after.
Ah okay, I can't afford free to play games anyway.
So, what happens if this particular FPGA no longer can be bought?
Isn’t it more likely it will be possible to compile and run current C code on 22nd century hardware (possibly on some virtualization solution) than that it will be possible to compile and run FPGA code on 22nd century hardware?
> So, what happens if this particular FPGA no longer can be bought?
The same code can easily target another FPGA as long as you don't use any vendor-specific primitives or IP cores.
I agree. For me, one of the challenging long term parts of video game preservation will be non-standard controllers and other peripherals, such as the Wii remote, Wii balance board, Guitar Hero/Rock Band guitars and drums, etc. Sure, technically you can use a mouse and keyboard for some of those, but it’s fundamentally an entirely different experience from the original.
The Wii came out over 17 years ago and third party companies still make controllers for them. Maybe that will still be the case 40 years after release, but eventually nearly all Wii consoles will stop working. Will there always be a big enough market of people playing Wii games on emulators to justify making those controllers and peripherals? I hope so, but am not sure. There almost certainly won’t be enough demand for 3rd party Wii balance boards, which I can honestly live with, but I do hope the main controllers themselves are still available for purchase or possible to make out of other hardware available in the future.
Verilog is just a logic implementation. It should be pretty portable. There are some issues, and certainly the bitstreams won’t be compatible, but if C still works, I imagine verilog would too.
I agree, this is tying the video game to a more recent hardware, which will also disappear in its own time. Much better to have a full software emulation that can be ported / recompiled to newer CPUs and new OS.
So how is this different from MiSTer FPGA [1]?
I’ve pasted the specs below, but in my opinion the biggest difference is that the OpenFPGA - in its Analogue Pocket form - is an end-user friendly target. MiSTer is more “enthusiast-friendly” with more options and upgrades (including a recommended add-on to the basic kit).
MiSTer “tech specs”:
Intel/Altera Cyclone V SE (5CSEBA6U23I7) FPGA SoC with 110,000LE (41,500ALM) and 5,570Kbit of Block RAM.
ARM Cortex A9 dual-core CPU at 800MHz.
HDMI video and audio allowing easy connectivity to any modern monitor/TV.
1GB of DDR3 RAM that is directly available to both ARM and FPGA.
High-speed ARM <-> FPGA interconnect due to both being in the same chip.
OpenFPGA specs:
Intel/Altera Cyclone V FPGA 49K logic elements and 3.4Mbit BRAM
Intel/Altera Cyclone 10 15K logic elements
2x independently addressable 16MB cellular RAM (128Mbit x 16), i.e. 32MB of low-latency memory
1x synchronous DRAM, 64MB (32Mbit x 16)
1x asynchronous SRAM, 256KB (128Kbit x 16)
You can't carry a MiSTer in your Pocket, but you can Dock a Pocket.
It doesn't support a few power-hungry systems supported by MiSTer.
MiSTer doesn't support OG physical carts.
What about the actual FPGA specs? Is it more power efficient per LUT? A newer generation or the same generation but a smaller chip?
The Pocket has an FPGA from the same series as the MiSTer, but it's a lower-end model that's optimized more for power consumption. Most notably, it's missing the 800 MHz ARM core (meaning that it runs a custom "OS" off of a microcontroller rather than Linux like the MiSTer), and it has a bit less than half the amount of logic elements. This means that 16-bit console emulation is about the highest you can go on the Pocket, while the MiSTer can emulate the Playstation, Saturn, and N64.
So is this really "open"? Or as open as OpenAI?
The web site looks like your average ex-googler's SaaS startup so I'm sceptical.
Hoping people who are more in the emulation scene will clear things up.
The name is a bit misleading.
It also seems to re-use the name of a (from what I can tell) separate project:
Yeah, it makes no sense. It's like naming a TV remote 'OpenMicrocontroller'.
What exactly is "open" about this?
The cores that are community contributed can be open source. The hardware and tooling are not open, so it doesn't meet most people's definition of open. On the spectrum it's much closer to being open than closed source emulation products that first parties distribute.
Is MiSTer considered open? It's dependent on a closed source toolchain to generate the bitstream.
So it is only open in the sense of commercially (for profit) taking advantage of pre-existing open HDL written by third-parties.
Disgusting.
No. You need to explicitly request the tooling from the for-profit company (Analogue, which wraps Intel/Altera) so that the open source cores you write can run on their closed source hardware. It sounds exactly like the situation with MiSTer, except that there is a second for-profit middleman. Afaik MiSTer cores can't run on open source hardware or compile on open source toolchains.
It's really a matter of taste whether you like Intel more than Intel+Analogue. It's wrong to call MiSTer an open source project but openFPGA not just because the number of for-profit companies needed is different.
>Afaik MiSTer cores can't run on open source hardware or compile on open source toolchains.
It's on a core-by-core basis, e.g. I managed to run some cores on ECP5 as well as GW1N, using open-source synth/route tools.
There's also some effort to make it FPGA family independent.
It is not hopeless; we will get there.
Yeah. Open toolchains have come a long way, but they’re not there yet. The Lattice ICE family are the best supported.
You're probably thinking iCE40.
Note ECP5 and Nexus support is similarly good by now, and these families have larger FPGAs better suited for MiSTer cores.
There's now also some support for the GW1N/GW2N FPGA families from China. These have the advantage of being relatively cheap, and I hear they have FPGAs with 250k+ LBs coming, with RISC-V hard cores built in.
Wow that sounds cool. I'm playing around with the UP5K (pico ice) right now with apio and it's already the dream. With bigger FPGAs the sky would be the limit.
That's a huge amount of marketing effort with very little actual answers to basic questions such as "why is this the future of video game preservation?" Seems like a hype-based product
Why hype-based? Isn't an easy to use vintage experience device a legitimate value proposition within itself?
It's a value proposition in the sense that it already exists and people want it, but it's nothing new. Cheap handheld emulator devices have existed for literally decades.
The "hype-based" aspect is that they're pretending like it's something revolutionary, hence why they are leaning on this Teenage Engineering-esque marketing style.
This isn't really new (it came out a couple of years ago), but as far as I know it was the first handheld FPGA emulator available.
There's debate in this thread about whether FPGA emulation is better or more accurate. In my experience it is, especially in recreation of sound as I remember it from back when these games were new.
Being new is not mandatory for a business. Being valuable is.
I'll go to a restaurant in a bit and I couldn't care less about how new it is, but hey, it's convenient!
I think this is not a great name for a product that isn’t developing silicon.
I am surprised no one has yet tried to make a portable shell for the DE-10 Nano (Mister-FPGA's target platform). With a power usage of 10 watts, I think it should be quite feasible even with the addition of a small LCD panel. However, it would probably require re-engineering some of the standard addon boards for the form factor.
They have https://youtu.be/8eAgjgbZyPw?si=ZIBEFLOc_13hitGp
Though sadly it's remained a prototype. I think the issue they had was you needed to mod the DE10 Nano (remove the pin headers and I think the ethernet jack) to fit it into the portable form factor. So they had concerns with either selling it in kit form where people had to do the mods themselves to expensive boards or doing it in house and effectively reselling modded DE10s.
A fully custom board would be the solution but then price is a big issue. The DE10 Nano is very cheap vs its BOM cost at catalogue prices.
I purchased the Analogue Pocket and it’s a great little device. Got an archive of all GB and GBC games and can play them off an SD card using some of the FPGA cores available. While expensive, it’s great to have that tactile feel just like an original Game Boy with modern quality of life features.
That's illegal isn't it? And it's Nintendo's intellectual property so you shouldn't admit it in public :)
Im sure he owns properly licensed copies of all those games. He’s just format shifting /s
I’m not even a real person so good luck catching me
Just a reminder that Analogue have delivered very little on their promises to those they stole time from.
Analogue isn't even the only ones to do an FPGA rebuild of a console; there's a near-perfect GB/GBC clone (with quirks toggles) that fits into a traditional Gameboy shell: https://funnyplaying.com/products/fpgbc-kit
>Just a reminder that Analogue have delivered very little on their promises to those they stole time from.
Please elaborate. These events are important to document.
The developer of mGBA wrote about it: https://endrift.com/2021/12/13/analogue-pocket-hate-story/
Some relevant excerpts:
> The next conversation unfortunately only brought more red flags. The first hint of impacting mGBA development had dropped: suddenly they were talking about delaying an mGBA release for a nebulous amount of time, directly contrary to what had been discussed prior. [...] The next conversation was suddenly about delaying until after the Pocket was released. At no point was such a thing discussed prior, but it was worded like it was. This was explained as putting a little bit of extra time after the release, though the reason was left implied; presumably they didn’t want mGBA stealing the Pocket’s thunder, as though that were at all a realistic scenario. And the amount of extra time proposed? Six months.
> By now it was clear to me that they didn’t respect me at all and there was no truth to the claim of the job not impacting my open source work. It all seemed to point to them seeing me as a source of cheap labor and then didn’t care at all how it impacted me, so long as I did the work for them. [It] really reflects on how little Analogue seems to actually care about the retro emulation community as a whole. In conversations with other emulator developers over the past it was spelled out that kevtris thinks of FPGA-based hardware emulation as inherently superior to software emulation, and is plenty willing to keep research he does towards the goal of perfecting his hardware solutions private, all while claiming that not only is it not even emulation (with an asterisk of course), it’s also the only route to perfect emulation. Neither of these claims is true.
> When I asked kevtris if he would release all of the documentation he had on GB/GBA he had said yes, after the Pocket shipped, but I’ve yet to see him release any of the documentation he’d promised for other projects, such as SNES, which have had products on the market for years now. The most I’ve seen is extremely basic overviews of a handful of obscure GBA behavior that, while valuable, is assuredly a tiny fraction of what he has.
The emulator writer scene lives on back-chatter. Analogue isn't even the only one... ask some developers what they think of RetroArch, who simply bundle up emulator cores... You won't get pretty answers.
Thank you.
It is more or less as bad as I imagined.
Lol, the audacity of a for profit company with a proprietary and closed source stack
I fail to see how programmable logic patched together == "the future of video game preservation". There's community, software, testing, etc... involved as well.
Does anyone know whether the rumor that hardware changes in later versions lead to game crashes is true or not?