Worklog: N64 Overclock Fixes

MRKane

Hello all. I'm back on the warpath again thanks to a discussion with a friend, Hypatia.

Generally this would have been an Assembler Games post for me, but that forum doesn't exist anymore, and BitBuilt seems to have a good collection of people who have the very special knowledge required to work through this sort of thing :)

Spurred on by ElectronAsh's video showing an apparently overclocked N64 with its graphical glitches corrected by an FPGA board, I decided to investigate whether an UltraHDMI would yield similar results.

Given that I'm here in New Zealand and suffer from our 'fresh air and sunny beaches tax', I opted to simply use the clock from a rotten N64 board I had on hand to do the X2 overclock, since it was technically acceptable, and I was hoping that bypassing the DAC with the UltraHDMI would reduce the visual glitching.
(Attached image: BoardWithOverclock.jpg)

But I've now got a rotten disposable N64 board with an UltraHDMI ribbon attached to it.

So where to from here: I understand that this could stem from two different issues at its core. It's either a failure at the RAM read speed or a mistiming issue, as suggested by ElectronAsh. What I'd like is some insight from the minds here so I can start to replicate the results that they posted.

Supposing that this is a RAM latency issue, I do have two Toshiba TC59R1809HK chips here that I could try swapping in as an experiment. These boast lower latency than the stock N64 RDRAM.
 

MRKane

Unsurprisingly the "improved" ram didn't give different results.

I think I was hoping that the VI would read from the framebuffer better given reduced latencies and improved tolerances, but it'd appear that the solution to the issue doesn't lie in that part of the system.
 

MRKane

Ok, so I set up some headers on the clocks and had the time to grab a 20.0 MHz clock and test it out. This naturally garbled the output video, but to my surprise the UltraHDMI fixed it!

This meant that I was able to do a couple of tests by increasing the clocks in tandem:
X1: 20.0 MHz
X2: 17.7 MHz

(Attached image: 20_17_UltraHDMI.jpg)


Naturally the setup is unstable as all hell: if it's bumped, or a tiny bird pokes it, it falls over. It also heats up very quickly, with the RCP becoming scalding to the touch after 10 minutes. Audio is predictably increased in both pitch and speed.

Still, I'm thrilled to have been successful with this little experiment. Next I might aim for the 1x speed downclock on the CPU and a matching upclock on both crystals, just to see if it's possible!
 

MRKane

So I recently wondered if the video would work with the chip set to the correct clock rate. Finding myself short on the necessary bits to do this properly, I opted to simply jumper the clock from one N64 to the other.

Suffice it to say that it didn't work. It could have been my Michael Mouse setup, the difficult place to drop a wire (someone put an UltraHDMI cable over the chip! How rude!), or simply a stupid idea.
 

MRKane

So the assortment of clocks I ordered arrived today.

I could push X1 up to 25.0 MHz but no higher, as the console wouldn't boot.
X2 wouldn't boot at 20.0 MHz, and that was the next step up from the 17.7 MHz I had.

So now, within the realm of a 1.5x speed, I checked the timing for comparison and found it wildly out: a 26-second video I took captured 31.85 seconds of "game time", which suggests that things aren't quite linear. I dropped a switch on the CPU to give some control between 1x and 1.5x and didn't feel that 1x provided better performance. Interestingly, at 1x speed the "game speed", as shown by the clock in Perfect Dark, was slower than actual time, with a 26-second video capturing 20.55 seconds of game time.
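For anyone wanting to sanity-check those figures, the effective game speed is just in-game seconds divided by wall-clock seconds of the capture. Here's a minimal Python sketch using only the numbers reported above (the helper name is mine, purely for illustration):

```python
# Effective game speed = in-game seconds / wall-clock seconds of the capture.
def effective_speed(game_seconds: float, video_seconds: float) -> float:
    return game_seconds / video_seconds

# ~1.5x clocks: a 26 s capture contained 31.85 s of game time
print(f"1.5x clocks: {effective_speed(31.85, 26.0):.2f}x game speed")  # ~1.22x

# 1x CPU divider (crystals still raised): a 26 s capture contained 20.55 s
print(f"1x CPU:      {effective_speed(20.55, 26.0):.2f}x game speed")  # ~0.79x
```

So a roughly 1.5x clock bump only bought about 1.22x game speed here, which is why the scaling doesn't look linear.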

Frustratingly, I still can't get the Everdrive to work, and even then it wouldn't be a good measure of performance improvement since both clocks have been changed, so this is left sitting in the realm of non-quantitative measurement, i.e. videos and guesswork.


And you bet fluffy helped out, and I think that's why there are green dots in the HDMI feed - I think he pulled on the ribbon a bit too much.

So... drawing to the end of a long road for me, and I really don't feel that I've found a "win" at this stage. I'm sure I could get a 20% boost without corrupted video if I could use the Everdrive and adjust the gameplay speed of Perfect Dark or GoldenEye directly, but the Everdrive just won't boot.

I think for the next part of this experiment I'll look at the different X2 clocks and at what point the video tearing begins to happen - I might be able to eke out a little performance that way, and a little is better than nothing.
 
This is really interesting. I remember a post from a couple of years back that tried similar things. That was on Assembler Games, so it's gone now. Here is a link to another forum which sums up a lot of the info.

 

MRKane

EB1560 did some fantastic stuff when probing the RAM timings and really helped rule that part of the equation out.

What I wanted to test here was overclocking the crystals by 50% and then dropping the CPU down to 1x speed. I can't get any good measurements besides a video as I truly am dirt poor and can't afford any other solution, but sadly I feel that the gain wasn't worthwhile here. That said, it's really interesting to note that the UltraHDMI can be used to get around the video timing issues in the MAV-NUS chip, and that increasing both clocks in tandem removes the VI/framebuffer "tearing" experienced when overclocking the RCP/RAM. Interestingly, the Everdrive cannot boot when the X2 clock is pushed too far.

No magic gains, no amazing results, years of pursuit and I think my final word on the matter is: Wait for the FPGA solution ;)

I've also got a hell of a weird board here that I'm hesitant to return to stock now, as someone might have a really good idea for it!
 

MRKane

I don't know that that'd help much - we could manually count fps, I guess, but it wouldn't surprise me if the UltraHDMI buffers and maintains a consistent 60fps output, only changing when the video from the N64 itself changes.
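If anyone did want to put a number on it, one way around the buffering would be to count how many captured frames are actually distinct rather than trusting the nominal 60fps of the HDMI feed. Here's a rough, hypothetical Python/OpenCV sketch (the file name "capture.mp4" and the difference threshold are placeholders I picked):

```python
# Rough sketch: estimate how many distinct frames the N64 actually produces
# inside a (possibly frame-repeated) 60 fps HDMI capture.
import cv2
import numpy as np

cap = cv2.VideoCapture("capture.mp4")   # placeholder capture file
fps = cap.get(cv2.CAP_PROP_FPS)

prev = None
total = 0
unique = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    total += 1
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Count a frame as "new" if it differs meaningfully from the previous one.
    if prev is None or np.mean(cv2.absdiff(gray, prev)) > 1.0:
        unique += 1
    prev = gray
cap.release()

duration = total / fps if fps else 0.0
print(f"capture: {total} frames at {fps:.2f} fps ({duration:.1f} s)")
if duration:
    print(f"estimated distinct frames per second: {unique / duration:.1f}")
```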

Cool little gadget however!
 

Miceeno

The GM73V1892AH16L RAM is supposed to have a low latency mode. I'd offer to send you some that I've got on hand, but it would probably cost more to ship, import, and invert it to your side of the world than for you to source it locally. The big red-top and black-top Ram Expanders have the low latency chips.
 

MRKane


I did swap in the Toshiba RAM as a test to see if I could push the RAM clock a bit further, and I remember EB1560 doing the latency tests using that RAM over at Shootersforever.

Logically, the test I could try with low-latency RAM is to see whether reducing the latency reduces the tearing on the video. But given that increasing both clocks in ratio results in no tearing, I don't think the tearing is down to a shortcoming in the RAM but more a synchronisation issue between components, so lower-latency RAM probably won't improve anything there.

Do you feel it's worthwhile just to cover all bases?
 

Miceeno

You'd have to check the datasheet, but something makes me think you have to enable the low latency mode in software. So you'd have to patch the ROMs for the RAM swap to make a difference.

I can't remember if flash carts were around when that post was made on Shooters Forever. If they weren't, it would have been harder to enable the low latency mode (if it has to be done in software).

I can't say if it will improve anything. I think it's more about hot-rodding for the sake of hot-rodding.
 

MRKane

I guess the next catch-22 is that if either clock is increased too much then my Everdrive won't boot, which kind of undermines the entire process, really. Frustratingly, I also couldn't enable the debug menus as a result of this.

The point of attack that interests me at this stage would be improving the VI read and framebuffer fill to try to remove the tearing from the screen, but even then we're throwing off a balance that games are programmed for.
 