
Re: [PATCH] CI: Clean up alpine containers


  • To: Anthony Perard <anthony.perard@xxxxxxxxxx>
  • From: Andrew Cooper <Andrew.Cooper3@xxxxxxxxxx>
  • Date: Fri, 18 Feb 2022 14:59:24 +0000
  • Accept-language: en-GB, en-US
  • Cc: Xen-devel <xen-devel@xxxxxxxxxxxxxxxxxxxx>, Doug Goldstein <cardoe@xxxxxxxxxx>, Wei Liu <wl@xxxxxxx>, Roger Pau Monne <roger.pau@xxxxxxxxxx>, Stefano Stabellini <sstabellini@xxxxxxxxxx>
  • Delivery-date: Fri, 18 Feb 2022 14:59:40 +0000
  • List-id: Xen developer discussion <xen-devel.lists.xenproject.org>
  • Thread-index: AQHYJMoUO4xl86uH0E6ly+blvuakQqyZYuyAgAADuwA=
  • Thread-topic: [PATCH] CI: Clean up alpine containers

On 18/02/2022 14:46, Anthony PERARD wrote:
> On Fri, Feb 18, 2022 at 01:18:11PM +0000, Andrew Cooper wrote:
>>  * `apk --no-cache` is the preferred way of setting up containers, and it does
>>    shrink the image by a few MB.
>>  * Neither container needs curl-dev.
>>  * Flex and bison are needed for Xen, so move to the Xen block.
>>
>> Signed-off-by: Andrew Cooper <andrew.cooper3@xxxxxxxxxx>
>> ---
>> CC: Doug Goldstein <cardoe@xxxxxxxxxx>
>> CC: Wei Liu <wl@xxxxxxx>
>> CC: Anthony PERARD <anthony.perard@xxxxxxxxxx>
>> CC: Roger Pau Monné <roger.pau@xxxxxxxxxx>
>> CC: Stefano Stabellini <sstabellini@xxxxxxxxxx>
>>
>> I've already rebuilt the containers and confirmed that the build is still
>> fine.
>> ---
>> diff --git a/automation/build/alpine/3.12-arm64v8.dockerfile b/automation/build/alpine/3.12-arm64v8.dockerfile
>> index a1ac9605959e..006cdb3668b3 100644
>> --- a/automation/build/alpine/3.12-arm64v8.dockerfile
>> +++ b/automation/build/alpine/3.12-arm64v8.dockerfile
>> @@ -8,46 +8,39 @@ RUN mkdir /build
>>  WORKDIR /build
>>  
>>  # build depends
>> -RUN \
>> -  # apk
>> -  apk update && \
>> +RUN apk --no-cache add \
>>    \
>>    # xen build deps
>> -  apk add argp-standalone && \
>> -  apk add autoconf && \
>> -  apk add automake && \
>> -  apk add bash && \
>> -  apk add curl && \
>> -  apk add curl-dev && \
>> -  apk add dev86 && \
>> -  apk add dtc-dev && \
>> -  apk add gcc  && \
>> +  argp-standalone \
>> +  autoconf \
>> +  automake \
> Since you are removing some other pkgs, I don't think "automake" is
> needed either. We only use "autoconf" and "autoheader". (Maybe the
> automake pkg give access to something we need, but I'm not sure about
> that.)

Very good observation.  We don't have automake in any other containers. 
I'll strip it out and double check the resulting build.
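
For reference, the consolidated block with automake dropped would look roughly like this (a sketch only, not the v2 patch; the package list is reconstructed from the per-line `apk add` calls visible in the quoted hunk, with curl-dev and automake removed, and the remainder of the list elided):

```dockerfile
# build depends
RUN apk --no-cache add \
  \
  # xen build deps
  argp-standalone \
  autoconf \
  bash \
  curl \
  dev86 \
  dtc-dev \
  gcc
```

Using a single `RUN apk --no-cache add` avoids leaving the apk index cache in the image layer, which is where the few MB of savings come from.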

> In any case, the changes look good:
> Reviewed-by: Anthony PERARD <anthony.perard@xxxxxxxxxx>

Thanks.

~Andrew

 

