Archive for January, 2020

Reverse engineering the Sony BKM-129X

Sunday, January 26th, 2020

So I got a couple of PVM-9L2’s, both coming with the BKM-120D SDI card, but no analogue RGB inputs are present by default. These can be had by replacing the BKM-120D with a BKM-129X card, which supplies standard RGBS/YPbPr in/out BNC sockets. The board is rather expensive, going for around $150 on eBay, although not that rare as such. The same board is used in other BVMs, and some other PVMs can also use it, for instance to add even more RGB inputs.

An original Sony BKM-129X RGB/YPbPr input card

Anyway, I looked into whether it was possible to make a clone of it. Looking at the service manual, I could see that it was more complex than, for instance, the JVC card (IF-C01COMG) I cloned some time ago.

Here’s the MCU setup on the BKM-129X (from the service manual, which can be downloaded on manualslib.com and most likely other places; it is shared with the BKM-142HD, 120D and 127W).

It actually has a microcontroller/MCU on board, coupled to the monitor through the 64-pin socket, seemingly communicating on an SPI channel. SPI usually has 4 wires: a clock (SCLK), data master->slave (MOSI, master out, slave in), data slave->master (MISO, master in, slave out), and a slave-select (SS, “selecting” the slave for communication, usually active low). Having a data channel in each direction makes SPI “full duplex”, meaning data can flow in both directions simultaneously (look up SPI on Wikipedia if you don’t know it). A dedicated slave-select is not present here, but there is a SLOT ID pin going from the socket into the MCU at two places, one called INT0, which is an interrupt pin, and another also called SLOT ID, so I guessed this to be functioning as slave select.

This data channel is then used by the monitor to verify the board is there, read some information from it and so on, thus activating RGB/YPbPr on said monitor when the card is present (and selected). Whether the card supplies YPbPr or RGB is then set up in the monitor itself through the OSD, and there is also a serial number readout. At the same time, the microcontroller has two “digital” outputs, one actually enabling the video output of the card (BX_OE, active low), and one (de)selecting external sync (EXT_SYNC_OE, active low).

To get anywhere I needed a card, so I bought one on eBay for too much money (the one shown in the beginning), which further fueled my desire to reverse engineer it, both for myself and the community.

The MCU of the board is some 8 MHz proprietary thing, and it has an EEPROM next to it, most likely holding the MCU’s program, but since the MCU is not something easily bought, I didn’t try to read it out.

I expected the initialization sequence to be something like:
1. Monitor selects the card by setting SLOT ID to LOW.
2. Monitor reads some info including serial number.
3. Monitor decides card is there depending on response.
4. Monitor starts using card, sends commands to turn the output and the external sync on and off depending on button selection.

And then following commands for external sync and whatnot being something like:
1. Monitor selects the card by setting SLOT ID to LOW.
2. Monitor issues some “simple” command.
3. Monitor deselects card.

So to figure out what the commands look like, what response to give, and the timings, I attached a logic analyzer to the RESET, MOSI, MISO, SCLK, SLOT ID, BX_OE and EXT_SYNC_OE pins. I have a DreamsourceLab DSLogic Plus USB logic analyzer, which can capture for a long period of time. This is great, as I need to capture during the whole startup and initialization of the monitor (which is around 5 seconds) to catch the information. At the same time, the software for it (DSView) has built-in SPI decoding, so it can show me the byte representation of the data being transmitted over the wires. The connector is a DIN-41612 style a+b connector, where the bottom row is the a row and the top is b. Pin numbers are counted from left to right. When the connector is vertical, as it is positioned in the monitor, the leftmost column is the a row and the rightmost is b, and pin numbers are counted from top to bottom.

So I set up the logic analyzer to capture for 5 seconds at 10 MHz (as SPI is often in the 1 MHz range, this should be plenty of oversampling). Doing this yielded a datastream like this CSV shows:

Complete CSV file of power-up sequence, card selected, external sync on.

Now SPI actually has 4 possible modes, which influence how data is interpreted: clock polarity (CPOL) and when to sample the data (CPHA) can each be 0 or 1, giving 4 possible configurations. To interpret the data correctly, these need to be set up properly. First of all, it can be seen that SCLK is high when nothing is happening/idling, meaning CPOL should be 1. To decide if CPHA should be 0 or 1, you need to look at the transitions of SCLK and the data lines in relation to each other. I decided CPHA should be 1 based on the waveform, meaning data is sampled on the rising clock edge. I then did this multiple times, with different sample speeds, to confirm I got the same readings again and again.
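In conventional SPI terminology, the CPOL/CPHA pair maps to a mode number, so the configuration above corresponds to SPI mode 3. A trivial sketch of the mapping (plain C++, just to make the convention explicit):

```cpp
#include <cassert>

// Conventional SPI mode number: mode = CPOL * 2 + CPHA.
// Mode 3 (CPOL=1, CPHA=1): clock idles high, data is sampled on the
// second edge of each clock period, i.e. the rising edge.
int spi_mode(int cpol, int cpha) {
    return (cpol << 1) | cpha;
}
```

With CPOL=1 and CPHA=1 as observed on the waveforms, the DSView decoder should thus be set to SPI mode 3.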

The output shows that SLOT ID seemingly *is* used as a slave select. The monitor issues a 0x9E command, the slave responds with the same, while the master clocks in an address byte, which is then answered by the card with whatever that address contains. This is a “classic” memory read pattern. There also seems to be a 0xBE request, which also throws out some data, so I expect that it’s another memory region being requested or something like that. Then there are some 0xF7, 0xF6 and 0xF4 requests, which seem to depend on whether the card is selected and whether external sync is enabled.

My current idea of the protocol is then like this:
0x9E – Memory read (followed by an address to read).
Example: 0x9E, 0x02, 0x00 reads memory address 0x02 (the last 0x00 is where the master clocks out the response), so the response would be 0x9E, 0x39.
0xBE – Memory read of “another” region (followed by an address to read).
The next commands are preceded by 0x00, 0x00:
0xF7 – Deselect (turn off outputs).
0xF6 – Selected, output enable (BX_OE=LOW/0), external sync disabled (EXT_SYNC_OE=HIGH/1).
0xF5 – Also some kind of deselect; I see it rarely.
0xF4 – Selected, output enable (BX_OE=LOW/0), external sync enabled (EXT_SYNC_OE=LOW/0).

Upon startup, the card always gets two deselects (0xF7) and then one of the select commands, depending on the external sync state.
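My working model of the command set can be sketched as a small request/response handler. This is a plain C++ simulation of my interpretation of the protocol, not the actual Arduino code; the memory layout and contents here are placeholders:

```cpp
#include <array>
#include <cassert>
#include <cstdint>

// Simulated card state. BX_OE and EXT_SYNC_OE are active low,
// so true = HIGH = disabled.
struct CardState {
    bool bx_oe = true;             // video output disabled
    bool ext_sync_oe = true;       // external sync disabled
    std::array<uint8_t, 16> mem{}; // placeholder for the 0x9E memory region
};

// Handle one command byte (plus an address byte for memory reads) and
// return the data byte the card would clock back out.
uint8_t handle_command(CardState& s, uint8_t cmd, uint8_t addr = 0) {
    switch (cmd) {
    case 0x9E: // memory read: answer with the contents of the address
        return s.mem[addr % s.mem.size()];
    case 0xF7: // deselect: outputs off
    case 0xF5: // rarely seen, treated as another deselect here
        s.bx_oe = true;  s.ext_sync_oe = true;  return 0;
    case 0xF6: // selected, output on, external sync disabled
        s.bx_oe = false; s.ext_sync_oe = true;  return 0;
    case 0xF4: // selected, output on, external sync enabled
        s.bx_oe = false; s.ext_sync_oe = false; return 0;
    default:
        return 0;
    }
}
```

The 0xBE region would get a second table of its own; I have left it out since I don’t yet know what it holds.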

So my first attempt was simply implementing all this on an Arduino Nano, putting the protocol into a request/response state machine that, each time the monitor asked for a specific byte, would return the same as the original BKM-129X did. This proved not to work. I got the request, 0x9E, and it was sent back, but then SLOT ID goes HIGH and never LOW again to read the response. Hot damn…

So I looked some more at the waveforms and noticed that the same happens for the original card; however, SLOT ID goes LOW again some time after the 0x9E response, and then stays LOW for quite some time, until SCLK suddenly starts clocking in the response data. Since this was not happening on my setup, I thought that maybe this was part of a handshake, because of the SLOT ID labeled pin on the MCU itself. So my guess was that the MCU should slave-select itself when the actual response was required. Sounds somewhat strange, but hey, it’s Sony. And whaddayaknow, it worked!

A single byte read. The first CS low is the monitor sinking SLOT ID, the second is the card.
Here’s a closeup of the initial part (representing the first CS low in the previous picture).
And here the second part, where CS is driven low by the card’s SLOT ID pin. Look at the time at the top; around 10 ms has passed since the command was issued.

Doing this, with only the Arduino Nano connected, I could now select RGB (no picture of course), read out a serial number (identical to my original card’s) and select and deselect external sync. To do it, the SLOT ID pin is first configured as an input. When the actual response is needed, the code waits for SLOT ID from the monitor to go HIGH, then configures the pin as an output and sets it LOW (thus slave-selecting itself), waits for the response to be read out over SPI, and then sets SLOT ID HIGH and goes back to an input.
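The handshake sequence described above can be modeled as a small state machine. This is a sketch of the flow as I understand it, with the actual pin handling abstracted away (state and event names are mine, not from any documentation):

```cpp
#include <cassert>

// Card-side SLOT ID handshake states, as I understand the sequence.
enum class HandshakeState {
    Idle,         // SLOT ID configured as input, waiting for the monitor
    GotCommand,   // monitor pulled SLOT ID LOW and clocked in a command
    WaitRelease,  // command received, waiting for SLOT ID to go HIGH again
    SelfSelected, // card drives SLOT ID LOW itself and clocks out the answer
};

enum class Event {
    MonitorSelect,  // monitor sinks SLOT ID
    CommandDone,    // command + address bytes fully received
    MonitorRelease, // monitor lets SLOT ID go HIGH
    ResponseDone,   // response clocked out over SPI
};

// Advance the handshake on an event; unexpected events leave the state as-is.
HandshakeState step(HandshakeState s, Event e) {
    switch (s) {
    case HandshakeState::Idle:
        return e == Event::MonitorSelect ? HandshakeState::GotCommand : s;
    case HandshakeState::GotCommand:
        return e == Event::CommandDone ? HandshakeState::WaitRelease : s;
    case HandshakeState::WaitRelease:
        // Here the pin is reconfigured as an output and driven LOW
        // ("self-selecting" the card).
        return e == Event::MonitorRelease ? HandshakeState::SelfSelected : s;
    case HandshakeState::SelfSelected:
        // After the response, set SLOT ID HIGH and go back to input mode.
        return e == Event::ResponseDone ? HandshakeState::Idle : s;
    }
    return s;
}
```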

Now I have no idea what most of the data reads represent, except that (at least) the first byte read, address 0x00, returns 0xC8 for the BKM-129X and 0xC0 for the BKM-120D, so I expect this is what the monitor uses to identify the card. The next addresses read, 0x02-0x09, represent the serial number, written as ASCII bytes (so 0x30 for 0, 0x31 for 1, 0x32 for 2 and so on, www.asciitable.com). Looking at the CSV from earlier, it can be seen that my original card has the serial number 2001911 (0x32, 0x30, 0x30, 0x31, 0x39, 0x31, 0x31).
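Since the serial number is plain ASCII, decoding it is trivial; a minimal sketch:

```cpp
#include <cassert>
#include <cstdint>
#include <string>
#include <vector>

// Decode the serial number bytes (read from addresses 0x02 onwards)
// as plain ASCII characters.
std::string decode_serial(const std::vector<uint8_t>& bytes) {
    return std::string(bytes.begin(), bytes.end());
}
```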

The PVM-9L2 with just the Arduino attached.
I changed the serial number to 5001337, and it reads fine.
The Arduino connected in the back, including the 100 Ohm and 100K resistors at the SLOT ID pin, and a BC550 to allow the monitor to reset the MCU.

I’ve put the code, which works on an Arduino Nano v3, on my github: https://github.com/skumlos/bkm-129x-mcu/

Update

The code has now been refactored, changed a bit, and tested on more monitors. At least newer monitors like the BVM-D9H and D14H seem to require that the bootloader is removed and the application runs as the only piece of software (seemingly to boot faster). This can be accomplished by programming the software directly onto the Arduino, without using the USB. This means a “real” programmer (ISP) is needed, and the ICSP header (or the corresponding pins) is used for programming. A cheap ISP can be made with another Arduino Nano; check out how to do that here: https://immerhax.com/?p=480 (scroll down a bit)

Any feedback is appreciated, please email to martin@hejnfelt.com or reach me on shmups (username: skum), on the CRT discord (username: skum) or on Facebook (Martin Hejnfelt).

Next up is designing the rest of the board to have a truly compatible card. I am in the last stages of this, and will release it here and on GitHub when done.

SEGA Mega Drive 2 50/60 Hz and language mod (single switch)

Friday, January 3rd, 2020

By default, the European Mega Drive (2) has a vertical refresh rate of 50 Hz, meaning games play slower than on its Japanese and American counterparts, which run at 60 Hz. We don’t like that, so that needs to be fixed! Getting a Mega Drive 2 to play games at the original 60 Hz speed, plus being able to switch the language on games that support it, is somewhat easy, but there are some caveats, especially if you’re a purist. This guide relates to a PAL Mega Drive 2 with a VA1 board. Note that I’m still a sucker for hardwired switches, but a lot of this can also be done with a MegaDrive++ setup.

Getting 60 Hz out of the MD2

First of all, the vertical frequency the Mega Drive generates depends on the voltage on pin 46 of the main CPU. When pin 46 is tied to 0 V/GND, it generates 50 Hz, while when it is logic high/5 V, it generates 60 Hz. On this board pin 46 is tied to GND not far from the main chip. This small bridge, as can be seen below, needs to be cut.

Where to cut when isolating pin 46
The connection to the GND plane is now cut

When the trace is cut like this, we isolate the pin. It is supposed to have a pull-up, and indeed, if you measure between 5 V and the pin, you’ll read around 25 kOhm, so it seems there is something here. At the same time, the trace is also connected to the MB3514 video encoder, which generates composite video and amplifies RGB. It is connected to pin 7, which (lucky for us) is also the pin that selects whether the MB3514 should produce an NTSC or PAL composite signal (and chroma in S-Video): 0 V for PAL, 5 V for NTSC. We’ll get back to this later.

Close to the power regulator (the heat sink thing) are 4 holes, which denote two jumper points, JP3 and JP4. The left side is connected to the above trace, meaning to pin 46 and MB3514 pin 7. The right side is connected to GND for JP3 and 5 V for JP4. Since the holes are through holes (most likely filled with solder), they are good points to connect to. Connecting 5 V to the left side will make the MD2 create 60 Hz video and the MB3514 create NTSC, while connecting 0 V will give 50 Hz video encoded as PAL. So connecting a switch here that selects between the two, by connecting 0 V or 5 V from the right side, makes it easy to switch between the two modes.

The jumper pads that make connection easier

Now if we try to connect 5 V to one of the pads on the left side, we will see that the video is now full screen and running noticeably faster. However, if we are using composite, we might be getting a black and white image!

Dafuqs up with my black and white picture?

Usually people will attribute this to the TV or monitor the machine is connected to not being able to decode NTSC, and then switch to RGB. The decision is good (composite sucks), but for the wrong reasons (especially if we know the set to be NTSC compatible). So why are we seeing black and white composite video? The MB3514 is properly set to generate NTSC; however, the chip also has a carrier frequency input (pin 6) which needs to match the specifications the receiver expects.

The MB3514 and the two pins for NTSC/PAL and carrier frequency

For PAL this frequency is ~4.43 MHz and for NTSC it is ~3.58 MHz. Since these subcarrier frequencies are used to encode only the colors of the video signal, and we are getting black and white, something is off. If we try to measure the frequency going to pin 6 we see this when in PAL mode:

MB3514 carrier frequency in PAL mode
The average frequency, 4.434 MHz

Note the Measurement in the bottom where it says “Average” and “4.434 MHz”, so here the carrier frequency is correct.

If we switch to 60 Hz / NTSC mode, we get this:

MB3514 carrier frequency in NTSC mode on PAL console

So we can see we have a carrier frequency of average 3.547 MHz, which is somewhat off the intended ~3.58 MHz, so this is why we are seeing a B/W picture; the receiver cannot lock onto the carrier frequency to decode color.

We can then deduce that the carrier frequency is created wrongly, and since the frequency is created by the main CPU, the answer to why can be found in the main oscillator, which supplies the clock signal for the system. On PAL consoles this is ~53.20 MHz (roughly 12 x 4.43 MHz), while on US and Japanese consoles it’s ~53.69 MHz (15 x 3.58 MHz). What happens is that when pin 46 is low (for 50 Hz PAL), the clock is divided by 12, and when driven high (for 60 Hz NTSC), it is divided by 15 (53.20 MHz / 15 is roughly 3.55 MHz, which matches what we saw above). Now how to fix this?
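The arithmetic is easy to verify. A helper like this (values in MHz) shows why dividing the PAL master clock by 15 lands at ~3.55 MHz instead of the proper NTSC subcarrier, while the NTSC master clock divided by 15 hits it:

```cpp
#include <cassert>
#include <cmath>

// Subcarrier frequency in MHz for a given master clock (MHz) and divider.
double subcarrier(double master_mhz, int divider) {
    return master_mhz / divider;
}
```

53.20 / 15 ≈ 3.547 MHz (the wrong value we measured), 53.69 / 15 ≈ 3.579 MHz (proper NTSC), and 53.20 / 12 ≈ 4.433 MHz (proper PAL).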

Bring in the Dual Frequency Oscillator (DFO)

The Dual Frequency Oscillator is a small device based on Texas Instruments’ CDCE family of clock generators. It can be programmed to generate different clock signals depending on an input. A guy called micro has made a design that is a drop-in replacement for the main oscillator (https://oshpark.com/profiles/micro), so by building one of these, replacing the original oscillator on the board, and feeding the same signal we send to pin 46 into the DFO, we can switch the main oscillator frequency to match the selected mode!

So we need to desolder the main oscillator and dump the DFO in. We can then take the signal to the DFO from the other unused through hole at either JP3 or JP4 and route that to S0.

The DFO installed

Doing this, we can then measure that the subcarrier frequency is now correct when in 60 Hz mode:

The white wire goes to S0 of DFO, the green to the switch, red/black to switch poles.
Fixed MB3514 carrier frequency

At the same time, this also fixes that the output frequency in fact was not 60 Hz, but more like 59.2 Hz, which can help when connecting to some flatscreen TVs and converters like the Framemeister. Note that much of this actually applies to many other 50 Hz PAL to 60 Hz NTSC conversions on Master System, PlayStation and so on.

So what about that Japanese language?

Pin 107 of the main chip selects whether the machine should report as being an English machine or a Japanese machine. This “helps” some dual language games to show either Japanese or English text; an example is Quackshot. When pin 107 is connected to logic high (5 V), the machine reports English; when 0 V, it reports Japanese. On this board, pin 107 is connected to 5 V through a via to a 5 V plane. This somewhat sucks, as we then either have to lift the pin, cut the trace, or cut around the via in the plane. I selected the second option: cut the trace at the via and then scratch some of the solder mask off to be able to solder to the trace.

The trace to pin 107 is cut
The yellow wire soldered to the trace to pin 107

Bringing it all together

Now we have two places to insert information: the 50/60 Hz selection and the language. Some use two switches for this, one for each; however, only 3 sane options exist:

50 Hz / English / Pin 46: 0 V, Pin 107: 5 V
60 Hz / Japanese / Pin 46: 5 V, Pin 107: 0 V
60 Hz / English / Pin 46: 5 V, Pin 107: 5 V

This can be done with a simple DP3T switch, so instead of having two switches, which technically gives the 4th “invalid” option of 50 Hz / Japanese, we hook up to one with 3 positions. The hookups look like this:

R1 is there to prevent shorting the supply when switching while the machine is on. It can be 1K-10K, but it needs to be less than the internal pull-ups (which seem to be in the 25K range, as mentioned earlier).
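The mapping between switch position and pin levels can be written down as a tiny lookup; a sketch (position numbering and names are mine, not from any schematic):

```cpp
#include <cassert>

// Pin levels for each of the three useful switch positions.
// true = 5 V, false = 0 V (GND).
struct ModePins {
    bool pin46_5v;  // pin 46: 5 V = 60 Hz / NTSC, 0 V = 50 Hz / PAL
    bool pin107_5v; // pin 107: 5 V = English, 0 V = Japanese
};

// Position 0 = 50 Hz / English, 1 = 60 Hz / Japanese, 2 = 60 Hz / English.
// The fourth combination (50 Hz / Japanese) is deliberately unreachable.
ModePins mode_pins(int position) {
    switch (position) {
    case 0:  return {false, true};  // 50 Hz / English
    case 1:  return {true,  false}; // 60 Hz / Japanese
    default: return {true,  true};  // 60 Hz / English
    }
}
```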

The DP3T, still missing language wire here

I like to conceal the switch as much as possible, which I do by using the left side and facing the switch downwards.

The installed switch

Of course some mechanical holes need to be drilled and filed in. How big they are depends on your switch.

The holes for the switch
The switch seen from the bottom

So now, by switching we can select all three regions with correct corresponding frequencies. All good!