WARNING - OLD ARCHIVES

This is an archived copy of the Xen.org mailing list, which we have preserved to ensure that existing links to archives are not broken. The live archive, which contains the latest emails, can be found at http://lists.xen.org/

[Xen-devel] Xen 3.0.4 migration failures

To: xen-devel <xen-devel@xxxxxxxxxxxxxxxxxxx>
Subject: [Xen-devel] Xen 3.0.4 migration failures
From: John Byrne <john.l.byrne@xxxxxx>
Date: Fri, 05 Jan 2007 16:52:53 -0800
Delivery-date: Fri, 05 Jan 2007 16:52:49 -0800
Envelope-to: www-data@xxxxxxxxxxxxxxxxxx
List-help: <mailto:xen-devel-request@lists.xensource.com?subject=help>
List-id: Xen developer discussion <xen-devel.lists.xensource.com>
List-post: <mailto:xen-devel@lists.xensource.com>
List-subscribe: <http://lists.xensource.com/cgi-bin/mailman/listinfo/xen-devel>, <mailto:xen-devel-request@lists.xensource.com?subject=subscribe>
List-unsubscribe: <http://lists.xensource.com/cgi-bin/mailman/listinfo/xen-devel>, <mailto:xen-devel-request@lists.xensource.com?subject=unsubscribe>
Sender: xen-devel-bounces@xxxxxxxxxxxxxxxxxxx
User-agent: Thunderbird 1.5.0.9 (X11/20061206)

Hi,

With Xen 3.0.4 built from source, migration is failing and the domain vanishes (both live and non-live). The output from "xm dmesg" and xend.log follows. Any ideas?
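For context, the failing migrations would have been triggered with commands along these lines (a sketch only; the domain name "mydom" and destination host "desthost" are placeholders, not taken from the report, and the commands are echoed rather than executed for illustration):

```shell
#!/bin/sh
# Hypothetical xm invocations; "mydom" and "desthost" are placeholders.
DOM=mydom
DEST=desthost
# Non-live (stop-and-copy) migration:
echo "xm migrate $DOM $DEST"
# Live migration:
echo "xm migrate --live $DOM $DEST"
```

Both variants end the same way here: the restore on the target fails and the source domain is gone.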

Thanks,

John Byrne

xm dmesg reports:

(XEN) mm.c:551:d0 Bad L1 flags 80000000
(XEN) mm.c:828:d0 Failure in alloc_l1_table: entry 194
(XEN) mm.c:1685:d0 Error while validating mfn 12eeb3 (pfn 1608) for type 20000000: caf=80000002 taf=20000001
(XEN) mm.c:976:d0 Failure in alloc_l2_table: entry 64
(XEN) mm.c:1685:d0 Error while validating mfn 11470e (pfn 3ad) for type 40000000: caf=80000002 taf=40000001
(XEN) mm.c:1039:d0 Failure in alloc_l3_table: entry 0
(XEN) mm.c:1685:d0 Error while validating mfn 11470a (pfn 3b1) for type 60000000: caf=80000002 taf=60000001
(XEN) mm.c:1960:d0 Error while pinning mfn 11470a

xend.log on the target machine reports:

ib/xen/bin/xc_restore 4 1 133120 1 2
[2007-01-05 18:47:17 xend 3206] INFO (XendCheckpoint:247) xc_linux_restore start: max_pfn = 20800
[2007-01-05 18:47:17 xend 3206] INFO (XendCheckpoint:247) Increased domain reservation by 82000 KB
[2007-01-05 18:47:17 xend 3206] INFO (XendCheckpoint:247) Reloading memory pages: 0%
[2007-01-05 18:47:24 xend 3206] INFO (XendCheckpoint:247) Received all pages (0 races)
[2007-01-05 18:47:24 xend 3206] INFO (XendCheckpoint:247) ERROR Internal error: Failed to pin batch of 31 page tables
[2007-01-05 18:47:24 xend 3206] INFO (XendCheckpoint:247) Restore exit with rc=1
[2007-01-05 18:47:24 xend.XendDomainInfo 3206] DEBUG (XendDomainInfo:1483) XendDomainInfo.destroy: domid=1
[2007-01-05 18:47:24 xend.XendDomainInfo 3206] DEBUG (XendDomainInfo:1491) XendDomainInfo.destroyDomain(1)
[2007-01-05 18:47:24 xend.XendDomainInfo 3206] ERROR (XendDomainInfo:1500) XendDomainInfo.destroy: xc.domain_destroy failed.
Traceback (most recent call last):
File "/disk2/xen/xen-3.0.4-testing.hg/dist/install/usr/lib/python/xen/xend/XendDomainInfo.py", line 1495, in destroyDomain
    xc.domain_destroy(self.domid)
Error: (3, 'No such process')
[2007-01-05 18:47:24 xend 3206] ERROR (XendDomain:1001) Restore failed
Traceback (most recent call last):
File "/disk2/xen/xen-3.0.4-testing.hg/dist/install/usr/lib/python/xen/xend/XendDomain.py", line 996, in domain_restore_fd
    return XendCheckpoint.restore(self, fd, paused=paused)
File "/disk2/xen/xen-3.0.4-testing.hg/dist/install/usr/lib/python/xen/xend/XendCheckpoint.py", line 167, in restore
    forkHelper(cmd, fd, handler.handler, True)
File "/disk2/xen/xen-3.0.4-testing.hg/dist/install/usr/lib/python/xen/xend/XendCheckpoint.py", line 235, in forkHelper
    raise XendError("%s failed" % string.join(cmd))
XendError: /usr/lib/xen/bin/xc_restore 4 1 133120 1 2 failed


_______________________________________________
Xen-devel mailing list
Xen-devel@xxxxxxxxxxxxxxxxxxx
http://lists.xensource.com/xen-devel
