Hello,
My test environment (a home lab, rather) consists of the following:
1 x Synology DS1812+ Diskstation
2 x Whitebox ESXi 5.0 systems (5.0.0, build 469512)
3 x network interfaces (1 x onboard Realtek 8111E and 1 x PCIe dual-port Intel 82571EB GigE).
1 x Unmanaged Netgear 8 port switch that everything is plugged into.
Unfortunately I am having a lot of issues setting up iSCSI multipathing.
I configured the Synology's LAN1 and LAN2 interfaces with two static IP addresses (on the same subnet, just for testing) and made sure I could ping both. I then set up iSCSI, added the datastore to both of my hosts, and configured Round Robin so that both paths show as Active (I/O). However, when I perform operations like copying files or running disk speed benchmarks in a VM and watch the network utilisation graph on the Synology, I only see the LAN2 adapter in use. There is no traffic going to the LAN1 adapter at all: http://i.snag.gy/jIf5i.jpg
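For reference, this is roughly how I have been checking the Round Robin setup from the ESXi shell. It is only a sketch: the naa device ID and the IP addresses below are placeholders, not my actual values.

    # show the LUNs and their current path selection policy
    esxcli storage nmp device list

    # set Round Robin on the Synology LUN (naa ID is a placeholder)
    esxcli storage nmp device set --device naa.60014050000000000000000000000000 --psp VMW_PSP_RR

    # list the paths for that LUN and check that both show up
    esxcli storage core path list --device naa.60014050000000000000000000000000

    # confirm each target portal is reachable from the host (placeholder IPs)
    vmkping 192.168.1.201
    vmkping 192.168.1.202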
So I did what any good IT guy would do: I turned the DiskStation off, then turned it back on. Voila, both adapters were accepting data and I saw sequential I/O speeds of 200 MB+ per second. Great!
I then left it overnight, and when I tested again the following morning I found that the LAN1 adapter was getting ALL of the traffic and LAN2 was getting nothing: http://i.snag.gy/CqE65.jpg
I am quite positive the issue is with the Synology and not VMware -- I have managed to reproduce the fault with CIFS as well, by configuring a shared folder and then, from two Windows 7 computers on my network, accessing the Synology on the two different IPs and copying files into the shared folder. It works fine for a while after a reboot (the exact period is unknown), then it's as if one of the interfaces goes to sleep and never wakes up until the next reboot.
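If it helps, the raw interface counters can also be checked over SSH on the DiskStation itself (assuming SSH is enabled and the two LAN ports appear as eth0 and eth1 in DSM's Linux environment), which is another way to confirm which adapter is actually moving data rather than relying on the resource monitor graph:

    # per-interface RX/TX byte counters
    cat /proc/net/dev

    # or look at each adapter individually
    ifconfig eth0
    ifconfig eth1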
Any ideas on how to rectify this?