I've been assigned a piece of work that involves finding the temperature
coefficient of resistance of a trace on an IC die. The trace is TiW, and I'm
using some pretty sophisticated test equipment. (I can't give too much info
because it's a process at the company where I work.) I apply a 100 mV signal,
measure the current I, and use Ohm's law (R = V/I) to find the resistance. I
repeat this at increasing temperatures. The problem I'm having is that the
resistance values are all over the place. They do show a linear increase in
resistance with temperature, but only if I average the results, and even then
I get anomalies in the slope. Is there a formula for the temperature
coefficient of TiW at different ratios of Ti to W?
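In case it helps to see how I'm reducing the data: a minimal sketch of the calculation, assuming the usual linear model R(T) = R0·(1 + α·(T − T0)) and fitting α by least squares over all the points rather than averaging pairwise slopes. The voltage, temperatures, and currents below are made-up placeholder numbers, not my real measurements.

```python
import numpy as np

# Placeholder data: temperatures (deg C) and measured currents (A)
# at a fixed 100 mV excitation. Substitute real measurements here.
V = 0.100  # applied signal, volts
T = np.array([25.0, 50.0, 75.0, 100.0, 125.0])                    # deg C
I = np.array([1.000e-3, 0.953e-3, 0.910e-3, 0.870e-3, 0.834e-3])  # A

# Ohm's law at each temperature point
R = V / I

# Least-squares fit of R(T) = R0*(1 + alpha*(T - T0)),
# i.e. a straight line R = slope*(T - T0) + R0
T0 = 25.0
slope, R0 = np.polyfit(T - T0, R, 1)
alpha = slope / R0  # temperature coefficient in 1/degC

print(f"R0 = {R0:.2f} ohm at {T0} degC")
print(f"TCR = {alpha * 1e6:.0f} ppm/degC")
```

Fitting one line through all the points at least keeps a single noisy reading from dominating the slope the way point-to-point differences do, though it obviously doesn't explain where the scatter is coming from in the first place.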