Back when dinosaurs roamed the earth, I worked at a place where one of the dev guys was from Russia, and when you asked if he got his code right, he'd say, in a very thick accent, "seems working".
Pretty soon every other tech started saying the same thing whenever they were ready to move on.
Boss: "Did you get that rack built out OK?" Tech: "Seems working".
Boss: "Did you review the code for the new firewall's security policy?" Tech: "Seems working".
Boss: "Is the CEO OK with his new home theater the shareholders built for him?" Tech: "Seems working."
This time I laid out a revised minimalist Atmel 328 development board to experiment with overclocking.
Yep, seems working. At least for blinking an LED.
From various web pages I found (here and here for instance), overclocking these ubiquitous 8-bit MCUs is pretty common, and at least one user went all the way up to 50MHz--that's 30MHz over the 20MHz the 328 is designed to run at--and lived to tell the tale. But many users caution that overclocked things work until they don't.
PCBs for the overclocking project come from audiodiwhy's generous sponsor, PCBWAY. Please help out this blog and check 'em out.
I wanted to mess with the basics, having never overclocked anything before, so I laid out a slightly larger footprint "AVR minimalist development board" (post about the original design is here) to accommodate basic experimentation.
Why bigger? I wanted to be able to get my finger onto the MCU chip to see if it was heating up, and to be able to quickly get to the crystal to desolder it and drop in something faster (or slower).
For experiments sometimes bigger is better, right?
As far as I can tell, after letting a main.c blink program run for about 20 minutes with a 30MHz crystal, the Atmel 328P chip wasn't heating up; it didn't seem to be drawing appreciably more current either (I powered the on-board regulator from a Siglent 3303 at 9VDC, 100mA max). This seemed almost too easy.
You can get a box of crystals cheap (I paid about $10 USD for what you see here) from AliExpress. Hours of fun.
The populated board is built from common junk-box type parts. It's the same schematic used in the Dirty Digital LFO, with a larger footprint.
The ZIF socket (chunky blue thing, top right) would have allowed super quick swaps of Atmel 328s to see if chip A survived overclocking better than chip B, but the idea didn't pan out. The drills I designed into this post's PCB to accommodate the ZIF socket weren't big enough in diameter, so I ended up using a normal 28-pin skinny socket for this build. For experiments like this a ZIF socket would have helped, but oh well.
I have already laid out a REV3 board with bigger drills to accommodate the ZIF. I will get that fabbed as well, um....sometime.....eventually?
Overall you can see (barely--the LED on the left is flashing faintly....junk box!) that a 30MHz xtal indeed runs a blink C program at almost twice the speed of a 16MHz xtal board with similar design, parts, and MCU.
Just as expected.
Here is the test code used:
#include <avr/io.h>

void Delay(void);

int main(void)
{
    /* DDRB = 0x20;  set bit 5 of DDR register, which makes PB5 an output */
    DDRB = 0b11111111;          /* 1 is output: make all of PORTB outputs */

    while (1)
    {
        PORTB = 0b11111111;     /* switch LED on */
        Delay();
        PORTB = 0b00000000;     /* switch LED off */
        Delay();
    }
}

/* Crude busy-wait delay: burns through a fixed number of iterations,
   so the blink rate scales directly with the clock frequency. */
void Delay(void)
{
    volatile unsigned long count = 150000;
    while (count--);
}
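A side note on why the 30MHz board blinks almost twice as fast: Delay() just burns through a fixed count of iterations, so the LED period shrinks in direct proportion to the clock. If you want a blink rate that stays put no matter what crystal is fitted, here's a minimal sketch using avr-libc's util/delay.h. The 30000000UL value is my assumption for a 30MHz xtal--keep F_CPU in sync with whatever you solder in, and compile with optimization on (-Os), as avr-gcc users usually do:

#define F_CPU 30000000UL        /* assumption: 30MHz xtal fitted; match this to yours */
#include <avr/io.h>
#include <util/delay.h>         /* needs F_CPU defined before it's included */

int main(void)
{
    DDRB = 0b11111111;          /* all of PORTB as output */
    while (1)
    {
        PORTB = 0b11111111;     /* LED on */
        _delay_ms(500);         /* 500ms regardless of crystal speed */
        PORTB = 0b00000000;     /* LED off */
        _delay_ms(500);
    }
}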
The Dirty Digital LFO needs a faster output frequency, and it's arguably easier to drop in a faster crystal and overclock it than do a bunch of code optimization. But will it still work? Will the ADC get screwed up? Will the AVR chip turn cherry red? Will I burn down my lab? One of these days I will drop a faster crystal in there, change the F_CPU define, and see what happens.
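About that ADC worry: the 328P's ADC wants a 50kHz-200kHz clock for full 10-bit resolution, and the biggest prescaler on offer is /128. At 30MHz that works out to about 234kHz--already out of spec--so some accuracy loss seems likely long before the chip turns cherry red. Here's a minimal sketch of what the setup looks like (a hedged example, not something I've run on the overclocked board):

#include <avr/io.h>

static void adc_init(void)
{
    ADMUX  = (1 << REFS0);      /* AVcc reference, ADC channel 0 */
    ADCSRA = (1 << ADEN)        /* enable the ADC */
           | (1 << ADPS2) | (1 << ADPS1) | (1 << ADPS0);  /* /128 prescaler: 30MHz/128 = ~234kHz */
}

static uint16_t adc_read(void)
{
    ADCSRA |= (1 << ADSC);              /* start a conversion */
    while (ADCSRA & (1 << ADSC));       /* wait for it to finish */
    return ADC;                         /* 10-bit result */
}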
For now, time to move on. At the very least, use a 20MHz xtal in your C work, right? The AVR is designed for that, even though the Arduino Uno R3 runs at 16MHz.
You can get this post's overclock experimenter's board, BOM, PDFs of the board layout, Eagle files, etc., from PCBWAY's project site: go here. Have fun--quickly!