Your assumption here is that the tail wags the dog, whereas I think it's really the dog that is wagging the tail. The real question imho is not "When did it become common to use it to offend people instead of describing them?" It is: "When did society stop embracing open racism?" Or for that matter, sexism.
For an ancient Greek, everyone non-Greek -- including Romans -- was an unsophisticated savage. They were lumped together as barbarians. Racism was open and perfectly normal. As was sexism: they valued slaves more than women.
These traditions and ideas were probably not novel even then, and they thrived in European culture until the last century.
Fast-forward to the colonial-era revolutionaries in the US or France, and mentalities had not evolved in the slightest. It was a given to everyone that an African was barely human.
If anything, things were about to take a turn for the worse, since the colonization of Africa and Asia had barely begun. Picture the mindset: how could these savages, who sold each other to us as slaves, be in any way related to us?
(The significant exception in Africa, one that got lip-service respect and escaped colonization before WWI, was Ethiopia: the country had been Christian for as long as anyone could remember.)
By the late 19th century, some new ideas had emerged; some good, some bad.
The idea of evolution had been introduced by Darwin, Marxists were promoting equality amongst men (and women) of all races, and many a progressive 19th-century thinker had embraced some or all of these ideas.
But these ideas emerged in the heyday of scientism and rationalism, in a world still driven by reactionary thought. Millennia-old notions such as the racial inferiority of Jews, gypsies, blacks, women and others were suddenly "rooted" in science by Aryanists and whatnot: a "nigger" was deemed closer to the monkey than the white man was.
The process reached its climax in Nazi death camps.
Upon waking up to these events, the powers that be -- finally -- decided that enough was enough.
Still, society didn't stop being racist overnight.
In Europe, we passed laws to criminalize racism, and went on witch-hunts after WWII. In the US, it took Martin Luther King and the changes that followed -- all the way to positive discrimination -- to set things straight in a society that had fought the world's first modern war to ban slavery but was otherwise indifferent to segregation almost a century later.
As these events unfolded from the 19th century onward, historical terms such as nigger progressively acquired negative connotations, especially after Martin Luther King. I'm tempted to conclude that the rest is history; unfortunately, it is but a work in progress.