I hate I2C for several reasons. It is only a two-wire bus, but for that
very reason it is insidious.
I usually use hardware peripherals when they are available, because they
are much more efficient and because in many cases they are the only
option.
Nowadays we have MCUs with plenty of UARTs, timers and so on, so there's
no real debate: choose a suitable MCU and use that damn peripheral.
So I usually start with the I2C peripheral available in the MCU, but I
have run into many issues.
I have experience with AVR8 and SAMC21 by Atmel/Microchip. In both cases
the I2C peripheral is much more complex than a UART or similar serial
interface. I2C single-master, which is the most common situation, is
very simple, but I2C multi-master introduces many critical corner cases.
I2C peripherals usually promise multi-master compatibility, so their
internal state machine is quite complex... and often there is some bug,
or some situation the designers did not expect, that leaves the code
stuck at some point.
I want to write reliable code that not only works most of the time, but
works ALL the time, in any situation (ok, 99%). So my first test with
I2C is a temporary short between SCL and SDA. In this case, the I2C
peripheral in the SAMC21 (they named it SERCOM in I2C master mode) hangs
forever. The manual says to write the ADDR register to start putting the
address on the bus and to wait for an interrupt flag when the operation
ends. That interrupt never fires. I can see the lines go down (because
the START condition pulls SDA low before SCL), but the INTFLAG bits stay
cleared forever. Even the error bits in the STATUS register (bus error,
arbitration lost, any sort of timeout...) stay cleared, and BUSSTATE
reports IDLE. As soon as the short is removed, the state machine moves
on.
Maybe I'm wrong, so I studied the Atmel Software Framework[1] and the
Arduino Wire library[2]. In both cases, a timeout is implemented at the
driver level.
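The pattern in both drivers boils down to bounding the wait on the
interrupt flag. Here is a minimal sketch of that idea; the function and
type names are mine, the flag is read through a callback so the code is
hardware-independent, and the loop-count budget is an arbitrary
placeholder that a real driver would tune to the bus speed:

```c
#include <stdbool.h>
#include <stdint.h>

/* Hypothetical driver-level timeout guard: poll a status flag, but give
 * up after a bounded number of iterations instead of spinning forever.
 * The flag reader is passed as a callback so the same guard works for
 * any peripheral register. */
typedef bool (*flag_reader_t)(void *ctx);

typedef enum { I2C_OK, I2C_TIMEOUT } i2c_status_t;

static i2c_status_t wait_flag_or_timeout(flag_reader_t read_flag,
                                         void *ctx, uint32_t budget)
{
    while (budget--) {
        if (read_flag(ctx))
            return I2C_OK;  /* hardware raised the flag in time */
    }
    /* Flag never came: the caller should reset the peripheral
     * (e.g. via CTRLA.SWRST on the SAMC21 SERCOM, as the datasheet
     * note suggests) and retry the transaction. */
    return I2C_TIMEOUT;
}
```

On a timeout, the caller is the one that decides the recovery policy;
the guard itself only reports the failure.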
Even the datasheet says:
"Note: Violating the protocol may cause the I2C to hang. If this
happens it is possible to recover from this state by a
software reset (CTRLA.SWRST='1')."
I think the driver code should be able to trust the hardware; there is a
contract between them, otherwise reliable software is impossible. For a
UART driver, you write the DATA register and wait for an interrupt flag
telling you when new data can be written to the register. If the
interrupt never fires, the driver hangs forever. Yet I have never seen a
UART driver that uses a timeout to recover from hardware that could
hang, and I have been using UARTs for many years now.
Considering all these big issues when you want to write reliable code,
I'm considering dusting off the good old bit-banging technique. For the
I2C single-master scenario it IS very simple: drive data low/high
(three-state), drive clock low/high. The only problem is calibrating the
clock frequency, but if you have a free timer that is simple too.
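To make that concrete, here is a sketch of a bit-banged START condition
and byte write. Everything here is illustrative: the pin operations are
abstracted behind callbacks (on real hardware, setting a line "high"
means releasing the open-drain output, and the delay callback sets the
bus speed), and clock stretching by the slave is not handled:

```c
#include <stdbool.h>
#include <stdint.h>

/* Minimal bit-banged I2C master sketch (single-master only).
 * All names are placeholders, not from any vendor library. */
typedef struct {
    void (*sda_set)(bool high);  /* release (1) or pull low (0) SDA */
    void (*scl_set)(bool high);  /* release (1) or pull low (0) SCL */
    bool (*sda_get)(void);       /* read SDA, used for the ACK bit */
    void (*delay)(void);         /* half-period delay */
} i2c_pins_t;

static void i2c_start(const i2c_pins_t *p)
{
    /* START condition: SDA goes low while SCL is high */
    p->sda_set(true);  p->scl_set(true);  p->delay();
    p->sda_set(false); p->delay();
    p->scl_set(false); p->delay();
}

/* Shift out one byte MSB first; returns true if the slave ACKed. */
static bool i2c_write_byte(const i2c_pins_t *p, uint8_t byte)
{
    for (int i = 7; i >= 0; i--) {
        p->sda_set((byte >> i) & 1); p->delay();
        p->scl_set(true);  p->delay();  /* slave samples SDA here */
        p->scl_set(false); p->delay();
    }
    /* 9th clock: release SDA and read the slave's ACK (low = ACK) */
    p->sda_set(true); p->delay();
    p->scl_set(true); p->delay();
    bool ack = !p->sda_get();
    p->scl_set(false); p->delay();
    return ack;
}
```

The whole transaction is plain sequential code: you always know exactly
where it is and that it will terminate.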
What is the drawback of bit banging? Maybe you write a few additional
lines of code (you have to clock out 9 pulses by hand to recover the
bus), but I don't think it is much more than using a peripheral and
protecting it with a timeout. In exchange you get code that is fully
under your control: you know when the I2C transaction starts and you can
be sure it will end, even when there are hardware issues on the board.
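Those 9 recovery pulses are easy to write by hand too. The idea (it also
appears in the I2C-bus specification as the "bus clear" procedure) is
that a slave stuck holding SDA low mid-byte will release the line once
it has been clocked through the rest of its byte. A sketch, with the
same placeholder pin-callback convention as above:

```c
#include <stdbool.h>
#include <stdint.h>

/* Hypothetical bus-clear routine: toggle SCL up to 9 times until the
 * stuck slave releases SDA. Pin names are illustrative. */
typedef struct {
    void (*scl_set)(bool high);
    bool (*sda_get)(void);
    void (*delay)(void);
} i2c_recovery_pins_t;

/* Returns true if SDA ended up released (bus recovered). */
static bool i2c_bus_clear(const i2c_recovery_pins_t *p)
{
    for (int i = 0; i < 9; i++) {
        if (p->sda_get())
            return true;  /* line already free, nothing to do */
        p->scl_set(false); p->delay();
        p->scl_set(true);  p->delay();
    }
    return p->sda_get();
}
```

After a successful bus clear, the master would issue a STOP (or a fresh
START) to put the bus back into a known state before retrying.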
[1]
https://github.com/avrxml/asf/blob/68cddb46ae5ebc24ef8287a8d4c61a6efa5e2848/sam0/drivers/sercom/i2c/i2c_sam0/i2c_master.c#L406
[2]
https://github.com/acicuc/ArduinoCore-samd/commit/64385453bb549b6d2f868658119259e605aca74d