A Memory Map Thing (Project [redacted])

It's time to unveil another one of my projects - here's hoping this one isn't as controversial as the GIEPY standardization! This only really concerns people who do ASM, though it could turn out to be useful to average hackers, too.

A month or two ago, I started work on new memory maps which will hopefully fix a lot of the problems the current ones have.

Before anyone starts firebombing me, let me say that I've specifically designed these in a way that can pretty much completely emulate the current design.

I'm open to feedback, bug reports, and feature suggestions. After everyone's looked at these, I'll work with the admins to implement them into the website.

Anyway, before giving you the link, I'll list what they do differently and why I think it's better. Or, if you prefer to see the link now, I don't blame you. I'm not sure if I've explained some of these correctly, so please do check the actual memory maps before commenting.

  • Pagination. This is probably quite controversial, but it's easy to customize or entirely disable. With all of the other new features, dumping the whole map on a single page makes little sense and slows the whole thing down.
  • Contiguousness. (woohoo word formation) A real console doesn't really split the memory map into completely distinct pieces. It makes more sense to have one memory map for each game, and then let you filter out the regions you want. This lets you, for example, have both the ROM and the RAM map on the same page, or search through the whole memory map at once.
  • Filtering. The new system allows for very advanced filtering which allows you to look for specific addresses, ranges of addresses, types, and descriptions in specific regions. All quirks of the memory map are taken into account - searching for an SA-1 pack address (such as $3019) will give you the correct result, and so will searching for a mirrored address (such as $020019, which mirrors $7E0019). Description searching is also powerful and allows you to look for keywords, substrings, or even regular expressions.
  • Registers. Every game can now display its registers. Super Mario World has SNES, SA-1, and SuperFX registers; Super Mario All-Stars has only SNES registers; Super Mario 64 has the Nintendo 64 registers (oh goodness I barely understood any of that document). This means you have regs.txt (and much more) right in the memory map.
  • Details. Every address can now have additional information, such as diagrams, tables, or anything else. This means you no longer need to consult SMWiki for the valid values of an address, you can check them right there.
  • Fluidity. If JavaScript is enabled, you can navigate the whole thing without ever refreshing the page. Of course, those who wish to browse without JavaScript get a very usable fallback.
  • Backwards compatibility. As I mentioned above, the new memory maps can pretty much be used like the old ones. Additionally, the final implementation won't break the current memory maps, so you can always use those.
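The mirror-aware filtering described above can be sketched roughly like this. This is a minimal illustration, not the site's actual (closed-source) code; `normalizeLoRom` is a hypothetical name, and real LoROM mapping has many more cases than shown here.

```javascript
// Rough sketch of mirror-aware address normalization.
function normalizeLoRom(addr) {
  const bank = addr >> 16;
  const offset = addr & 0xFFFF;
  // In banks $00-$3F and $80-$BF, offsets $0000-$1FFF mirror the first
  // 8 KiB of WRAM, so e.g. $020019 resolves to $7E0019.
  if (offset < 0x2000 && (bank <= 0x3F || (bank >= 0x80 && bank <= 0xBF))) {
    return 0x7E0000 | offset;
  }
  return addr;
}
```

With something like this in the search path, a query for $020019 and a query for $7E0019 hit the same map entry.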

And that's about all I can think of. You can find the sample memory maps at this link. Currently, a lot of the data is incomplete (I've converted only a few addresses to use the new details system) or completely broken (Yoshi's Island pretty much has no addresses; Super Mario World is fully functional, however). It doesn't actually use the site's database, so I was pretty lazy with filling everything out. Of course, this won't be a problem when it's actually implemented.

That should about cover everything. As I said, feel free to leave feedback (if you hopelessly break them, please do tell me and don't let it slide).

cool beans

oam map pls (for completeness, not usefulness)
This could be quite useful. While I did not have many problems with the old maps, this is definitely an improvement.

Nevertheless, I don't like the pagination. I usually search for things with Ctrl-F (in the old ROM/RAM maps), but this system only searches the current page.

A small nitpick would be that clicking through to new pages is not instantaneous. The current ROM map takes 3.84 s to load for me, whereas your site loads in 500 ms for a page size of 25 and 800 ms if I view everything, so this is a speed improvement. But I also measured that new pages load in about 200 ms, so after clicking through two additional pages, I have not gained anything compared to a full load on a single page.

What would be really nice is to have searching be instantaneous. I think you currently process the search requests on the server, so for each search I pay the 200ms + some additional cost since apparently the requests are not sent immediately to the server. Instead you could load all the data upfront and do the search completely in JavaScript. This should give near instant search results.
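A minimal sketch of such a client-side filter, assuming the map data has already been loaded upfront as an array of objects (the field names here are made up for illustration):

```javascript
// Hypothetical client-side description search over preloaded map data.
function searchEntries(entries, keyword) {
  const needle = keyword.toLowerCase();
  // Case-insensitive substring match over every entry's description.
  return entries.filter(e => e.description.toLowerCase().includes(needle));
}
```

Once `entries` is in memory, each keystroke only costs one linear pass over the array, with no network round-trip.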

--------------------
Your layout has been removed.
Originally posted by Ladida
oam map pls (for completeness, not usefulness)

I'll look into it but I'm making no guarantees because the OAM is on bus B and my searching system can't handle bus B (I never considered it).

Originally posted by Horrowind
Nevertheless, I don't like the pagination. I usually search for things with Ctrl-F (in the old ROM/RAM maps), but this system only searches the current page.

As I mentioned in the first post, the pagination can be turned off (look to the right of the pages; you can set how many items to show per page), so you get essentially the same searching system as with the current maps. If you use a filter, it'll search through absolutely anything.

Originally posted by Horrowind
A small nitpick would be that clicking through to new pages is not instantaneous. The current ROM map takes 3.84 s to load for me, whereas your site loads in 500 ms for a page size of 25 and 800 ms if I view everything, so this is a speed improvement. But I also measured that new pages load in about 200 ms, so after clicking through two additional pages, I have not gained anything compared to a full load on a single page.

Filtering and switching pages can't really be done instantaneously, since it does have to query the database, filter the results, and paginate them. The code which does all of that is as optimized as I could make it, so it really depends on the server. That being said, I'm not sure how my server compares to Caffie, so the speed might degrade once it's moved to the website.

Originally posted by Horrowind
What would be really nice is to have searching be instantaneous. I think you currently process the search requests on the server, so for each search I pay the 200ms + some additional cost since apparently the requests are not sent immediately to the server. Instead you could load all the data upfront and do the search completely in JavaScript. This should give near instant search results.

Typing in the filtering fields has a 1000 ms debounce on it to prevent overloading the server with requests. If you switch to a page, click the Apply button, or press Enter, it'll be sent immediately.
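A debounce like that can be sketched in a few lines. This is a generic illustration, not the site's actual code; the `flush` helper stands in for the Apply button / Enter path described above.

```javascript
// Minimal debounce: delay calling fn until input pauses for delayMs.
function debounce(fn, delayMs) {
  let timer = null;
  const debounced = (...args) => {
    clearTimeout(timer);                       // restart the countdown on every keystroke
    timer = setTimeout(() => fn(...args), delayMs);
  };
  debounced.flush = (...args) => {             // bypass the delay (Apply button / Enter)
    clearTimeout(timer);
    fn(...args);
  };
  return debounced;
}
```

Wiring the text fields to `debounce(sendRequest, 1000)` and the Apply button to `.flush()` reproduces the behavior described.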

While I did consider making searching client-side, it pretty much has to be server-side because of its complexity. Address filtering internally emulates the memory map (LoROM, SA-1, and Super FX all at once in the case of Super Mario World), so I'd pretty much have to send the whole database to achieve the same power. Address filtering also pretty heavily relies on PCRE regular expressions, so transferring it over might not be easy.

While client-side searching definitely has its advantages (as you said, faster response), the disadvantages (the complexity and the size of the data would mean basically freezing your browser while it's filtering) outweigh them in my opinion. Not to mention that this would make searching require JavaScript, which is not great for the people who browse with NoScript or similar (yes, those people exist; no, I don't know why they exist). Linking to a pre-defined search would also be harder (it's very easy right now, there's a link on every search result).
Originally posted by telinc1
Address filtering internally emulates the memory map (LoROM, SA-1, and Super FX all at once in the case of Super Mario World), so I'd pretty much have to send the whole database to achieve the same power.

Is that any different from disabling pagination?

Quote
Address filtering also pretty heavily relies on PCRE regular expressions

I'm not aware of any meaningful difference between PCRE and JavaScript regexes.

They're considered different flavors, so there are a few differences, but I doubt they're important.
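For what it's worth, one concrete flavor difference: PCRE features like possessive quantifiers (`a++`) don't exist in JavaScript regex syntax, so such a pattern can't be handed to `new RegExp` directly. A quick compatibility check could look like this (sketch; the function name is made up):

```javascript
// Returns whether a pattern string is valid JavaScript regex syntax.
// PCRE-only constructs such as possessive quantifiers (a++) make the
// RegExp constructor throw a SyntaxError.
function isValidJsRegex(pattern) {
  try {
    new RegExp(pattern);
    return true;
  } catch (e) {
    return false;
  }
}
```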

--------------------
<blm> zsnes users are the flatearthers of emulation
Originally posted by Alcaro
Is that any different from disabling pagination?

Disabling pagination doesn't guarantee you get the whole database. It'd be more like turning off pagination and turning on all regions. Except that there's data I don't send to the client, such as the type of memory map (LoROM, Super Mario 64, or anything else).

Originally posted by Alcaro
I'm not aware of any meaningful difference between PCRE and JavaScript regexes.

They're considered different flavors, so there are a few differences, but I doubt they're important.

That's technically true. That wasn't really the best argument against client-side searching.
Originally posted by telinc1
Typing in the filtering fields has a 1000 ms debounce on it to prevent overloading the server with requests. If you switch to a page, click the Apply button, or press Enter, it'll be sent immediately.

While I did consider making searching client-side, it pretty much has to be server-side because of its complexity. Address filtering internally emulates the memory map (LoROM, SA-1, and Super FX all at once in the case of Super Mario World), so I'd pretty much have to send the whole database to achieve the same power. Address filtering also pretty heavily relies on PCRE regular expressions, so transferring it over might not be easy.

While client-side searching definitely has its advantages (as you said, faster response), the disadvantages (the complexity and the size of the data would mean basically freezing your browser while it's filtering) outweigh them in my opinion. Not to mention that this would make searching require JavaScript, which is not great for the people who browse with NoScript or similar (yes, those people exist; no, I don't know why they exist). Linking to a pre-defined search would also be harder (it's very easy right now, there's a link on every search result).


Well, you can have both a server-side search and a client-side search. The server side keeps searches linkable and supports NoScript users, while the client side makes the maps really pleasant to use. And in all seriousness, I doubt that the browser will freeze. I suspect that the size of the data you are dealing with is on the order of 1 MB (https://www.smwcentral.net/ajax.php?a=getmap&p=nmap&m=smwrom gives 500 kB), and that should not be a problem to analyze on current computers. At least I would not dismiss this idea before testing it.

--------------------
Your layout has been removed.
Originally posted by Horrowind
Well, you can have both a server-side search and a client-side search. The server side keeps searches linkable and supports NoScript users, while the client side makes the maps really pleasant to use. And in all seriousness, I doubt that the browser will freeze. I suspect that the size of the data you are dealing with is on the order of 1 MB (https://www.smwcentral.net/ajax.php?a=getmap&p=nmap&m=smwrom gives 500 kB), and that should not be a problem to analyze on current computers. At least I would not dismiss this idea before testing it.

I'm not going to bother making up arguments anymore. The reason I'm against this is simply due to the amount of work it would take and how unmanageable the result would be.

The current system (it's proprietary code which I am not willing to share with anyone other than the other site coders) is an interconnected object-oriented structure. The code just for searching alone is about 2350 lines, but I'd need to reimplement pretty much the whole memory maps in JavaScript for client-side searching to realistically work (so the total comes up to about 4340). Of course, measuring complexity in lines of code is rather idiotic (especially considering that everything is well-documented), but it gives a rough estimate of the size of it all. Making the whole thing client-side would not only require sending everything useful from the database (not that big of a deal, it's nothing too complex) but also porting the memory map system along with a considerable part of Kieran's proprietary SMW Central code. Seeing as JavaScript is publicly visible, I'm sure he'd be strongly opposed to that.

Even if the system is ported to JavaScript, you'd now have the same code written in two different languages (not to mention two of the most hated languages; JavaScript is actually my favorite language, but that's an unpopular opinion). Any change to it would require updating both the PHP code and the JavaScript code, which is clearly not very rational.

I considered client-side searching at the very beginning of this project, trust me. In the end, however, I decided that server-side searching would be the cleaner solution. Doing it client-side would probably improve searching times, yes, but not significantly. Memory map data is big, and the delay mostly comes from the searching algorithm, which takes about 200 ms (the number is not random; I've benchmarked and timed everything to make sure it's as optimized as I can make it), not from latency. Of course, I'm saying this without considering whether JavaScript is slower than PHP, which depends on how well V8 optimizes it (and assumes an environment which uses V8 in the first place). If we assume that JavaScript is equally fast, then you're going to spend roughly the same amount of time searching (a little less if it's faster). In the end, you get a small speed increase and a lot more disadvantages, such as the effort needed to port the system to JavaScript and maintain the end result.

This is why I'm not keen on testing the idea. I know where the delay when searching comes from and it's probably quite safe to say that client-side searching won't eliminate it. It'll only complicate the whole thing, potentially make it less stable, and take hours of work.
This is totally fine. You are not a full-time developer of these maps, and it is nice of you to improve them in your free time!

One last idea I'd like to propose for the mapping logic. It just occurred to me and I wanted to share it; this is not a request for you to actually do it. Try to precompute the memory map. What I mean by this is that you have a large array with an entry for every byte in the SNES address space. Each entry holds an index into a table with the actual data (Address, Type, Description, Details, etc.).

Then lookup by address is really easy (index into the first table to get the index you use in the second table). Lookup by description is just as easy as before, since you can search the second table for it.

A 24-bit address space can address 16 MB; if we assume a two-byte index, we have 32 MB of data. I assume this data will compress really well with, say, gzip, since most banks are simply unused, so a transfer will probably be as fast as the current "All" page. On the client we can decompress with JavaScript. If you think that 32 MB is still too much to handle on the client, you can do a two-stage lookup, first by bank and then by address within the bank, to be a bit less demanding on user memory.

Pros: You only need the address-space logic on the server, where you have to recreate the first table only when the map changes.
Cons: A naive approach will use 32 MB of memory on the client. That is probably okay with the amount of memory available on today's machines, but still a bit huge.

For SM64 you would need to come up with something different, since the address space is larger. Maybe a two-stage lookup of the address will suffice.
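The two-stage idea can be sketched per bank like this. This is illustrative only; it assumes fewer than 65,535 entries so a `Uint16Array` index fits, with 0xFFFF reserved as a "no entry" marker, and the entry fields are made up.

```javascript
// Precomputed per-bank lookup: one slot per byte of the bank, each slot
// holding an index into the entry table (0xFFFF = no entry here).
const NO_ENTRY = 0xFFFF;

function buildBankIndex(entries, bank) {
  const index = new Uint16Array(0x10000).fill(NO_ENTRY);
  entries.forEach((entry, i) => {
    if ((entry.address >> 16) !== bank) return;
    const start = entry.address & 0xFFFF;
    for (let off = 0; off < entry.size; off++) {
      index[start + off] = i;        // every covered byte points at its entry
    }
  });
  return index;
}

function lookup(index, entries, addr) {
  // The caller picks the right bank's index; only the low 16 bits matter here.
  const i = index[addr & 0xFFFF];
  return i === NO_ENTRY ? null : entries[i];
}
```

Address lookup then becomes two array reads, with no per-entry scan, which is the speedup being proposed.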

--------------------
Your layout has been removed.
> context filter for ram maps
O H Y E S

I'll try to remember to check this out more extensively later this week. Busy days lately.
My blog. I could post stuff now and then
Layout by Counterfeit.
Originally posted by Horrowind
Try to precompute the memory map. What I mean by this is that you have a large array with an entry for every byte in the SNES address space. Each entry holds an index into a table with the actual data (Address, Type, Description, Details, etc.).

That's actually a really good idea. Of course, memory map data is stored too differently for this to be applicable directly, but a similar concept could speed up address searching since it avoids looking at every individual address. This is something I should consider for the future, especially if filtering speed is degraded on the website.

Looks and feels pretty good, but why is the layout* handled using tables instead of CSS like my website or the VWF Dialogues Patch V1.2 Readme?

*Note: When I say layout, I mean why does <body> contain a single <table> element which is then used to lay out the entire page? It’s terrible for responsive design.

List of reasons why using <table>s for design is terrible:


P.S.: I can try to fix that <table> mess if the source is available on GitHub.

It uses tables because the layout is a copy of the Rain theme, which was either created ages ago, when tables were still a common way to do layouts, or copied the HTML from a different layout (maybe the default one, which seems to have gone mostly unedited for like a decade) and changed stuff until it looked good. Go complain about the tables in the Site Questions forum.
What randomdude999 said. The webpage I linked to is just a demonstration version. From the very beginning, I had planned to actually integrate it into SMW Central, not keep it a separate website.

If you take a closer look at the source code, most of the layout is directly copy-pasted from SMW Central and uses a mostly unmodified version of rain.css, which is the Rain scheme you can select from Edit Profile. Doing it properly would make no sense in this case because I'd either need to convert it into tables or make custom CSS for every possible site scheme.

As for why SMW Central uses a table-based design, it's because it was created years ago, back when Internet Explorer 6 was the standard browser and tables were pretty much the only way to position anything. Today, it looks quite dated, but still works.

By the way, I wouldn't need help to create anything without using tables. After all, most web designers can do that.

The purpose of this site is not to distribute copyrighted material, but to honor one of our favourite games.

Copyright © 2005 - 2018 - SMW Central