s23021536: Sure why not. After all, we are all androids, albeit biological and very sophisticated. Problem is though, that feelings will have to be emulated and will therefore be fundamentally of a different nature than ours.
Of a different nature, certainly, but psychology has advanced far enough in the past 150 years that an emotional response to given stimuli can be fairly accurately predicted. Even if an android wouldn't actually process emotions in the same way that our biological brains do, the emotional response could (theoretically) still be rendered indistinguishable from that of a human being.
I suppose 'emulated' really is the best word for it. We emulate a Commodore 64, and to all intents and purposes, what we're playing on the screen is more or less indistinguishable from a real C64 for most people. An android could theoretically do the same with emotions.
However, a C64's SID audio chip has real oscillators and a real analog audio filter. An emulator replicates the effect of these in software, but doesn't actually "generate" or "filter" any sound waves. There are certain differences between an emulated C64 and a real C64 that are imperceptible to most people. Sure, there are audiophiles who can hear the difference, and other people are inevitably bothered that it's not the real deal, despite not actually perceiving any difference.
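To make the oscillator/filter distinction concrete, here's a minimal sketch of the software approach: a digital phase-accumulator oscillator plus a one-pole low-pass filter standing in for analog circuitry. This is purely illustrative (real SID emulators model the chip far more precisely, and all the frequencies and parameters below are made up for the example):

```python
import math

def square_osc(freq_hz, sample_rate, n_samples):
    """Digital oscillator: a phase accumulator producing a square wave.
    No voltage is generated -- it's just a list of numbers."""
    phase = 0.0
    out = []
    for _ in range(n_samples):
        out.append(1.0 if phase < 0.5 else -1.0)
        phase = (phase + freq_hz / sample_rate) % 1.0
    return out

def one_pole_lowpass(samples, cutoff_hz, sample_rate):
    """Digital stand-in for an analog RC low-pass filter:
    y[n] = y[n-1] + a * (x[n] - y[n-1]), with 'a' derived from the cutoff.
    It replicates the *effect* of filtering without any real circuit."""
    a = 1.0 - math.exp(-2.0 * math.pi * cutoff_hz / sample_rate)
    y = 0.0
    out = []
    for x in samples:
        y += a * (x - y)
        out.append(y)
    return out

# 440 Hz square wave, smoothed by a 2 kHz "filter" -- all arithmetic.
raw = square_osc(440.0, 44100, 1000)
filtered = one_pole_lowpass(raw, 2000.0, 44100)
```

The output is arithmetic that approximates what the analog hardware would do to a voltage, which is exactly the sense in which emulated emotions would be computation approximating what a biological brain does.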
Likewise, psychologists will invariably be able to notice minor 'bugs' and 'inaccuracies' in an android's emotional processing, and other people will be made a little uncomfortable by the fact that the emotions aren't real. In the latter case, there doesn't necessarily have to be a problem with the android's emotional processing; rather, the problem lies in the eye of the beholder. Many of a real human's emotional responses are based on compassion and empathy, or on cruelty and vindictiveness, and difficulty in feeling compassion or empathy for an android will inevitably influence a relationship between a human being and an android.
And, as awalterj said, there's the issue of the uncanny valley: even with near-100% emotional response accuracy, there will be certain things that just don't 'seem' right.
(By the by, anyone remember the old STTNG storyline about Data and his emotion chip, which was carried on into Star Trek: Generations? I think that chip was supposed to be essentially a 'hardware emulator')