Recently I stumbled across this post by Mark Osborne. Using this method, it is relatively easy to use the USI in the ATtiny to implement a UART, as the ATtiny doesn't have a UART peripheral of its own. Using the USI has the advantage of higher baud rates and less CPU overhead than a pure software serial. In this post I will assume you have read the article, so I won't go into much detail about the implementation. I will also limit myself to the transmitting side only.
Why so slow?
So I implemented the USI Serial Send solution on an ATtiny85 clocked at 1 MHz. My goal was to see how far I could push the baud rate while the chip is running at its lowest clock speed. After fixing a minor bug in the example code, I had a nicely working UART at a baud rate of 2400. However, increasing it to 4800 made the terminal spit out garbage. If we simply divide the clock speed by the baud rate, we see that the chip has 208 clock cycles available per bit. That should be far more than needed: even if we purely bit bang in software, we can manage within 208 clock cycles. So what else was going wrong here?
As Timer0 is used as the clock for the USI, I assumed the problem had to be in there somewhere. This timer is also used by the Arduino Core if you use that, so there will be a collision there, as described in the post. Adjusting the timer's properties while it is running is somewhat dangerous, so I improved the code to stop the timer, set all the properties, and only then start it again. This can be done by setting the TSM and PSR0 bits in GTCCR and clearing them when we are done.
// Configure Timer0
GTCCR = (1<<TSM)|(1<<PSR0);  // Put Timer0 in sync mode
TCCR0A = 2<<WGM00;           // CTC mode
TCCR0B = CLOCKSELECT;        // Set prescaler to clk or clk/8
OCR0A = FULL_BIT_TICKS;      // Trigger every full bit width
TCNT0 = 0;                   // Count up from 0
GTCCR = 0;                   // Disable sync mode
Can we go even faster?
With this change I could increase the baud rate to 9600, already a fourfold improvement! However, increasing the baud rate to 19200 caused problems: the terminal was spitting out garbage again.
To my surprise this was caused by interference on the reset pin of the ATtiny85. As soon as I touched the pin myself, or touched anything grounded, the serial worked fine. It took me a while to figure that out. Why this happens is still unknown to me. It might have something to do with the fact that I don't have any grounded power outlets in my house, or maybe with the fact that I am running everything from the USB ports of my PC. Either way, it was solved by pulling all unused pins down to ground in the setup.
// Pull all unused pins down to ground
pinMode(0, OUTPUT);
pinMode(2, OUTPUT);
pinMode(3, OUTPUT);
pinMode(4, OUTPUT);
pinMode(5, OUTPUT);
digitalWrite(0, LOW);
digitalWrite(2, LOW);
digitalWrite(3, LOW);
digitalWrite(4, LOW);
digitalWrite(5, LOW);
Pushing the limits
Now that the interference wasn't a problem anymore either, it was time to figure out the theoretical limit of the baud rate when running at 1 MHz. The USI can be clocked at a maximum speed of CLK/2, so this is definitely not the limiting factor. For sending a byte we need to send two packets (again: read the article). We prepare the second packet when the overflow interrupt fires. However, in the meantime the clock is still running, so our code must finish preparing the second packet and placing it in the USI before the clock ticks again.
After compiling the code as it is right now and disassembling it, I calculated that it takes (at least) 41 clock cycles before the next packet is ready: 4 clock cycles for the interrupt to fire, 3 to jump to the interrupt code and 34 to execute the code itself. It can take longer than this if the instruction running when the interrupt occurs is a multi-cycle instruction, as that instruction will be finished before the interrupt is serviced. Dividing 1 MHz by 41 gives us a potential baud rate of 24390. The nearest lower well-used baud rate is therefore 19200, so we were already at our limit. Still, that is 8 times faster than the original code.
Higher clock speeds
Now that we know how to calculate the theoretical limit, we can do the same for chips running at 8 and 16 MHz.
| Clock speed (MHz) | Theoretical limit (baud) | Practical limit (baud) |
|---|---|---|
| 1 | 24390 | 19200 |
| 8 | 195121 | 115200 |
| 16 | 390243 | 230400 |
I tested all of the above baud rates and they seem to work as intended.
Well, this is weird
Just for fun I tried the code at baud rates of 38400 and 57600 as well. It turns out the code still works; at least the terminal didn't produce any unreadable characters. However, without an oscilloscope it's very difficult to check the signal for errors and figure out why I am still getting seemingly valid results.
You can find this slightly modified example in my GitHub fork.
I am using an ATtiny85 for my project. It is possible to send data at 115200, but reception at 115200 is not possible; I am reading garbage values at that baud rate. I am using only the Serial.read function to receive a byte, not the USI.
If you're using the Serial.read function, you're using the SoftwareSerial library. I am not really sure how the SoftwareSerial library is implemented, but I guess it bit bangs in software. Then it all depends on how many clock cycles that takes.
Maybe this is possible if you're running your ATtiny85 at 16 MHz. However, for this article I limited myself to an ATtiny85 clocked at 1 MHz and extrapolated the results for chips at 8 and 16 MHz.
As bit banging is likely not nearly as efficient as this implementation, and the theoretical limit of this implementation is already 390243 baud at 16 MHz, I doubt that SoftwareSerial is reliably sending your data. You should use a scope to verify that.
But the fact that you say receiving is not possible at this speed (keeping in mind that receiving takes practically the same order of time as sending) already shows it's not reliable.