Compare commits

...

14 Commits

Author SHA1 Message Date
Andrius Štikonas 4148b5da72
Merge pull request #222 from fosslinux/python
Python
2023-01-18 18:46:27 +00:00
Andrius Štikonas 60973abb90
Merge pull request #226 from pder/remove-make-3.80
Eliminate make 3.80 and use 3.82 instead when building with tcc
2023-01-18 18:45:38 +00:00
Paul Dersey 91c168bb7d Eliminate make 3.80 and use 3.82 instead when building with tcc
Unlike make 3.80, make 3.82 calls putenv, which does not exist in mes
libc, so a stub was created for this function.

The checksum for the util-linux package required an update.
When built with the original make 3.80 it resulted in an extra file
/usr/share/man/man8/.8 that does not exist when building with 3.82.
2023-01-18 11:40:50 -05:00
fosslinux 6ec368ce37 Change Python -> python
(lowercase convention)
2023-01-18 22:57:04 +11:00
fosslinux a4d1a445ac Add documentation 2023-01-18 08:20:35 +11:00
fosslinux d0a522113f Fix QEMU/chroot differences in outputs for Python 2023-01-18 08:20:34 +11:00
fosslinux 42fa6c24c1 Add Python-3.11.1 2023-01-18 08:20:34 +11:00
fosslinux d28ea08295 Add Python-3.8.16 2023-01-18 08:20:34 +11:00
fosslinux d1d422abc5 Add Python-3.4.10 2023-01-18 08:20:34 +11:00
fosslinux f34defc485 Add Python-3.3.7 2023-01-18 08:20:34 +11:00
fosslinux 2325df7f38 Add Python-3.1.5 2023-01-18 08:20:34 +11:00
fosslinux eadc92cf38 Add Python-2.5.6 2023-01-18 08:20:34 +11:00
fosslinux a94c8dbdce Add Python-2.3.7 2023-01-18 08:20:34 +11:00
fosslinux 13eaba86e8 Add Python-2.0.1 2023-01-18 08:20:34 +11:00
56 changed files with 2880 additions and 12 deletions

LICENSES/PSF-2.0.txt Normal file

@@ -0,0 +1,47 @@
PYTHON SOFTWARE FOUNDATION LICENSE VERSION 2
1. This LICENSE AGREEMENT is between the Python Software Foundation
("PSF"), and the Individual or Organization ("Licensee") accessing and
otherwise using this software ("Python") in source or binary form and
its associated documentation.
2. Subject to the terms and conditions of this License Agreement, PSF hereby
grants Licensee a nonexclusive, royalty-free, world-wide license to reproduce,
analyze, test, perform and/or display publicly, prepare derivative works,
distribute, and otherwise use Python alone or in any derivative version,
provided, however, that PSF's License Agreement and PSF's notice of copyright,
i.e., "Copyright (c) 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010,
2011, 2012, 2013, 2014, 2015, 2016, 2017, 2018, 2019 Python Software Foundation;
All Rights Reserved" are retained in Python alone or in any derivative version
prepared by Licensee.
3. In the event Licensee prepares a derivative work that is based on
or incorporates Python or any part thereof, and wants to make
the derivative work available to others as provided herein, then
Licensee hereby agrees to include in any such work a brief summary of
the changes made to Python.
4. PSF is making Python available to Licensee on an "AS IS"
basis. PSF MAKES NO REPRESENTATIONS OR WARRANTIES, EXPRESS OR
IMPLIED. BY WAY OF EXAMPLE, BUT NOT LIMITATION, PSF MAKES NO AND
DISCLAIMS ANY REPRESENTATION OR WARRANTY OF MERCHANTABILITY OR FITNESS
FOR ANY PARTICULAR PURPOSE OR THAT THE USE OF PYTHON WILL NOT
INFRINGE ANY THIRD PARTY RIGHTS.
5. PSF SHALL NOT BE LIABLE TO LICENSEE OR ANY OTHER USERS OF PYTHON
FOR ANY INCIDENTAL, SPECIAL, OR CONSEQUENTIAL DAMAGES OR LOSS AS
A RESULT OF MODIFYING, DISTRIBUTING, OR OTHERWISE USING PYTHON,
OR ANY DERIVATIVE THEREOF, EVEN IF ADVISED OF THE POSSIBILITY THEREOF.
6. This License Agreement will automatically terminate upon a material
breach of its terms and conditions.
7. Nothing in this License Agreement shall be deemed to create any
relationship of agency, partnership, or joint venture between PSF and
Licensee. This License Agreement does not grant permission to use PSF
trademarks or trade name in a trademark sense to endorse or promote
products or services of Licensee, or any third party.
8. By copying, installing or otherwise using Python, Licensee
agrees to be bound by the terms and conditions of this License
Agreement.

LICENSES/Python-2.0.1.txt Normal file

@@ -0,0 +1,193 @@
PYTHON SOFTWARE FOUNDATION LICENSE VERSION 2
--------------------------------------------
1. This LICENSE AGREEMENT is between the Python Software Foundation
("PSF"), and the Individual or Organization ("Licensee") accessing and
otherwise using this software ("Python") in source or binary form and
its associated documentation.
2. Subject to the terms and conditions of this License Agreement, PSF hereby
grants Licensee a nonexclusive, royalty-free, world-wide license to reproduce,
analyze, test, perform and/or display publicly, prepare derivative works,
distribute, and otherwise use Python alone or in any derivative version,
provided, however, that PSF's License Agreement and PSF's notice of copyright,
i.e., "Copyright (c) 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010,
2011, 2012, 2013, 2014, 2015, 2016, 2017, 2018, 2019, 2020, 2021, 2022 Python Software Foundation;
All Rights Reserved" are retained in Python alone or in any derivative version
prepared by Licensee.
3. In the event Licensee prepares a derivative work that is based on
or incorporates Python or any part thereof, and wants to make
the derivative work available to others as provided herein, then
Licensee hereby agrees to include in any such work a brief summary of
the changes made to Python.
4. PSF is making Python available to Licensee on an "AS IS"
basis. PSF MAKES NO REPRESENTATIONS OR WARRANTIES, EXPRESS OR
IMPLIED. BY WAY OF EXAMPLE, BUT NOT LIMITATION, PSF MAKES NO AND
DISCLAIMS ANY REPRESENTATION OR WARRANTY OF MERCHANTABILITY OR FITNESS
FOR ANY PARTICULAR PURPOSE OR THAT THE USE OF PYTHON WILL NOT
INFRINGE ANY THIRD PARTY RIGHTS.
5. PSF SHALL NOT BE LIABLE TO LICENSEE OR ANY OTHER USERS OF PYTHON
FOR ANY INCIDENTAL, SPECIAL, OR CONSEQUENTIAL DAMAGES OR LOSS AS
A RESULT OF MODIFYING, DISTRIBUTING, OR OTHERWISE USING PYTHON,
OR ANY DERIVATIVE THEREOF, EVEN IF ADVISED OF THE POSSIBILITY THEREOF.
6. This License Agreement will automatically terminate upon a material
breach of its terms and conditions.
7. Nothing in this License Agreement shall be deemed to create any
relationship of agency, partnership, or joint venture between PSF and
Licensee. This License Agreement does not grant permission to use PSF
trademarks or trade name in a trademark sense to endorse or promote
products or services of Licensee, or any third party.
8. By copying, installing or otherwise using Python, Licensee
agrees to be bound by the terms and conditions of this License
Agreement.
BEOPEN.COM LICENSE AGREEMENT FOR PYTHON 2.0
-------------------------------------------
BEOPEN PYTHON OPEN SOURCE LICENSE AGREEMENT VERSION 1
1. This LICENSE AGREEMENT is between BeOpen.com ("BeOpen"), having an
office at 160 Saratoga Avenue, Santa Clara, CA 95051, and the
Individual or Organization ("Licensee") accessing and otherwise using
this software in source or binary form and its associated
documentation ("the Software").
2. Subject to the terms and conditions of this BeOpen Python License
Agreement, BeOpen hereby grants Licensee a non-exclusive,
royalty-free, world-wide license to reproduce, analyze, test, perform
and/or display publicly, prepare derivative works, distribute, and
otherwise use the Software alone or in any derivative version,
provided, however, that the BeOpen Python License is retained in the
Software, alone or in any derivative version prepared by Licensee.
3. BeOpen is making the Software available to Licensee on an "AS IS"
basis. BEOPEN MAKES NO REPRESENTATIONS OR WARRANTIES, EXPRESS OR
IMPLIED. BY WAY OF EXAMPLE, BUT NOT LIMITATION, BEOPEN MAKES NO AND
DISCLAIMS ANY REPRESENTATION OR WARRANTY OF MERCHANTABILITY OR FITNESS
FOR ANY PARTICULAR PURPOSE OR THAT THE USE OF THE SOFTWARE WILL NOT
INFRINGE ANY THIRD PARTY RIGHTS.
4. BEOPEN SHALL NOT BE LIABLE TO LICENSEE OR ANY OTHER USERS OF THE
SOFTWARE FOR ANY INCIDENTAL, SPECIAL, OR CONSEQUENTIAL DAMAGES OR LOSS
AS A RESULT OF USING, MODIFYING OR DISTRIBUTING THE SOFTWARE, OR ANY
DERIVATIVE THEREOF, EVEN IF ADVISED OF THE POSSIBILITY THEREOF.
5. This License Agreement will automatically terminate upon a material
breach of its terms and conditions.
6. This License Agreement shall be governed by and interpreted in all
respects by the law of the State of California, excluding conflict of
law provisions. Nothing in this License Agreement shall be deemed to
create any relationship of agency, partnership, or joint venture
between BeOpen and Licensee. This License Agreement does not grant
permission to use BeOpen trademarks or trade names in a trademark
sense to endorse or promote products or services of Licensee, or any
third party. As an exception, the "BeOpen Python" logos available at
http://www.pythonlabs.com/logos.html may be used according to the
permissions granted on that web page.
7. By copying, installing or otherwise using the software, Licensee
agrees to be bound by the terms and conditions of this License
Agreement.
CNRI LICENSE AGREEMENT FOR PYTHON 1.6.1
---------------------------------------
1. This LICENSE AGREEMENT is between the Corporation for National
Research Initiatives, having an office at 1895 Preston White Drive,
Reston, VA 20191 ("CNRI"), and the Individual or Organization
("Licensee") accessing and otherwise using Python 1.6.1 software in
source or binary form and its associated documentation.
2. Subject to the terms and conditions of this License Agreement, CNRI
hereby grants Licensee a nonexclusive, royalty-free, world-wide
license to reproduce, analyze, test, perform and/or display publicly,
prepare derivative works, distribute, and otherwise use Python 1.6.1
alone or in any derivative version, provided, however, that CNRI's
License Agreement and CNRI's notice of copyright, i.e., "Copyright (c)
1995-2001 Corporation for National Research Initiatives; All Rights
Reserved" are retained in Python 1.6.1 alone or in any derivative
version prepared by Licensee. Alternately, in lieu of CNRI's License
Agreement, Licensee may substitute the following text (omitting the
quotes): "Python 1.6.1 is made available subject to the terms and
conditions in CNRI's License Agreement. This Agreement together with
Python 1.6.1 may be located on the internet using the following
unique, persistent identifier (known as a handle): 1895.22/1013. This
Agreement may also be obtained from a proxy server on the internet
using the following URL: http://hdl.handle.net/1895.22/1013".
3. In the event Licensee prepares a derivative work that is based on
or incorporates Python 1.6.1 or any part thereof, and wants to make
the derivative work available to others as provided herein, then
Licensee hereby agrees to include in any such work a brief summary of
the changes made to Python 1.6.1.
4. CNRI is making Python 1.6.1 available to Licensee on an "AS IS"
basis. CNRI MAKES NO REPRESENTATIONS OR WARRANTIES, EXPRESS OR
IMPLIED. BY WAY OF EXAMPLE, BUT NOT LIMITATION, CNRI MAKES NO AND
DISCLAIMS ANY REPRESENTATION OR WARRANTY OF MERCHANTABILITY OR FITNESS
FOR ANY PARTICULAR PURPOSE OR THAT THE USE OF PYTHON 1.6.1 WILL NOT
INFRINGE ANY THIRD PARTY RIGHTS.
5. CNRI SHALL NOT BE LIABLE TO LICENSEE OR ANY OTHER USERS OF PYTHON
1.6.1 FOR ANY INCIDENTAL, SPECIAL, OR CONSEQUENTIAL DAMAGES OR LOSS AS
A RESULT OF MODIFYING, DISTRIBUTING, OR OTHERWISE USING PYTHON 1.6.1,
OR ANY DERIVATIVE THEREOF, EVEN IF ADVISED OF THE POSSIBILITY THEREOF.
6. This License Agreement will automatically terminate upon a material
breach of its terms and conditions.
7. This License Agreement shall be governed by the federal
intellectual property law of the United States, including without
limitation the federal copyright law, and, to the extent such
U.S. federal law does not apply, by the law of the Commonwealth of
Virginia, excluding Virginia's conflict of law provisions.
Notwithstanding the foregoing, with regard to derivative works based
on Python 1.6.1 that incorporate non-separable material that was
previously distributed under the GNU General Public License (GPL), the
law of the Commonwealth of Virginia shall govern this License
Agreement only as to issues arising under or with respect to
Paragraphs 4, 5, and 7 of this License Agreement. Nothing in this
License Agreement shall be deemed to create any relationship of
agency, partnership, or joint venture between CNRI and Licensee. This
License Agreement does not grant permission to use CNRI trademarks or
trade name in a trademark sense to endorse or promote products or
services of Licensee, or any third party.
8. By clicking on the "ACCEPT" button where indicated, or by copying,
installing or otherwise using Python 1.6.1, Licensee agrees to be
bound by the terms and conditions of this License Agreement.
ACCEPT
CWI LICENSE AGREEMENT FOR PYTHON 0.9.0 THROUGH 1.2
--------------------------------------------------
Copyright (c) 1991 - 1995, Stichting Mathematisch Centrum Amsterdam,
The Netherlands. All rights reserved.
Permission to use, copy, modify, and distribute this software and its
documentation for any purpose and without fee is hereby granted,
provided that the above copyright notice appear in all copies and that
both that copyright notice and this permission notice appear in
supporting documentation, and that the name of Stichting Mathematisch
Centrum or CWI not be used in advertising or publicity pertaining to
distribution of the software without specific, written prior
permission.
STICHTING MATHEMATISCH CENTRUM DISCLAIMS ALL WARRANTIES WITH REGARD TO
THIS SOFTWARE, INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND
FITNESS, IN NO EVENT SHALL STICHTING MATHEMATISCH CENTRUM BE LIABLE
FOR ANY SPECIAL, INDIRECT OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES
WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN
ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT
OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.

parts.rst

@@ -100,7 +100,7 @@ using older versions compilable by tinycc. Prior to this point, all tools
have been adapted significantly for the bootstrap; now, we will be using
old tooling instead.
make 3.80
make 3.82
=========
GNU ``make`` is now built so we have a more robust building system.
@@ -535,9 +535,8 @@ that we do not have available.
make 3.82
=========
GNU Make is updated by .02. The most notable thing is this is now built properly
using the build system and GCC, which means that it does not randomly segfault
while building the Linux kernel.
GNU ``make`` is now rebuilt properly using the build system and GCC, which means that
it does not randomly segfault while building the Linux kernel.
kexec-tools 2.0.22
==================
@@ -858,3 +857,127 @@ musl 1.2.3
With GCC and binutils supporting a musl-based toolchain natively, musl itself is rebuilt
with support for dynamic linking.
python 2.0.1
============
Everything is in place to bootstrap the widely used programming language
Python. While Python is largely written in C, many parts of the codebase are
generated by Python scripts, a dependency that only grew as Python matured.
We begin with Python 2.0.1, which has minimal generated code, most of which can
be removed. The Lib/{keyword,token,symbol} scripts are rewritten in C and used
to regenerate parts of the standard library. Unicode support and sre (regex)
support are stripped out.
Using the stripped-down first version of Python 2.0.1, Python 2.0.1 is rebuilt,
including Unicode and regex support (required for future Python builds). The
first version is insufficient to run the Lib/{keyword,token,symbol} scripts, so
those continue to use the C versions.
Precompiled Python code (``.pyc`` files) is highly unreproducible at this
point, so it is deleted and modules are compiled at import time instead. This
makes Python slower to start, but that is of little consequence.
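The unreproducibility is structural: a timestamp-based ``.pyc`` embeds the source file's modification time in its header. A minimal sketch, shown on modern CPython (3.7+ header layout, forced to timestamp mode) rather than the bootstrap's 2.0.1, whose format carries an analogous four-byte mtime field:

```python
# Sketch: a timestamp-based .pyc bakes the source mtime into its header,
# so byte-identical sources built at different times yield different
# .pyc files. Header layout shown is CPython 3.7+: magic, flags, mtime, size.
import os
import struct
import tempfile
import py_compile

src = os.path.join(tempfile.mkdtemp(), "mod.py")
with open(src, "w") as f:
    f.write("x = 1\n")

pyc = py_compile.compile(
    src, cfile=src + "c",
    invalidation_mode=py_compile.PycInvalidationMode.TIMESTAMP,
)
with open(pyc, "rb") as f:
    header = f.read(16)

# Bytes 8-12 hold the source mtime: the artifact depends on build-tree state.
mtime = struct.unpack("<I", header[8:12])[0]
print(mtime == (int(os.stat(src).st_mtime) & 0xFFFFFFFF))
```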
python 2.3.7
============
Python 2.0.1 is sufficient to build Python 2.3.7.
Differences to 2.0.1:
* The new ``ast`` module, which parses Python source, is generated from a
  parsing specification using Python code.
* 2.0.1 is insufficient to run 2.3.7's unicode regeneration, so Unicode
support is again stripped out.
Python 2.3.7 is then rebuilt to include Unicode support.
python 2.5.6
============
Python 2.3.7 is sufficient to build Python 2.5.6, with a few minimal changes to
language constructs in scripts. This is the last 2.x version we build.
Differences to 2.3.7 are very minimal.
python 3.1.5
============
Python 2.5.6 is new enough to build Python 3.1.5, allowing us to move into
the modern 3.x series of Python. Various patches are required, as some
scripts in the tree are still Python 2 while others are Python 3; the
Python 3 ones have to be converted back to Python 2 so they run under 2.5.6.
Differences to 2.5.6:
* Removing a distributed file introduces an include cycle; we have to jump
  through some hoops to make this work.
* On the second pass of the build, various charset encodings can be
  regenerated and used in the standard library (required by future Python 3.x).
* The new ssl Python library is disabled due to our OpenSSL version being too
new.
Python 3.1.5 is rebuilt, using Python 3 for the Python 3 scripts in the tree.
python 3.3.7
============
Python 3.1.5 is sufficient to build Python 3.3.7 (the language was changing
rapidly, so we take small version jumps).
Differences to 3.1.5:
* The ssl Python library can now be re-enabled, and ``_ssl_data.h`` regenerated.
python 3.4.10
=============
Python 3.3.7 is sufficient to build Python 3.4.10.
Differences to 3.3.7:
* The Argument Clinic tool has been introduced, which unifies documentation
  with code and creates many generated files. We run the clinic tool across
  all files that use it.
* The ssl library breaks in much uglier ways than before but, unlike in
  previous versions, the build passes over the error silently.
python 3.8.16
=============
Python 3.4.10 is sufficient to build Python 3.8.16.
Differences to 3.4.10:
* The build system has been significantly revamped (coming in line with modern
standards).
* Many of our previous regenerations can be replaced with one ``make regen-all``
invocation.
* The stringprep Python module, previously deleted, is now required, so it is
regenerated.
python 3.11.1
=============
The newest version of Python, 3.11.1, can now be built.
Differences to 3.8.16:
* Unfortunately, the build system has regressed slightly. We must order the
  regenerations in the Makefile ourselves, as some regenerations depend on
  others but the Makefile does not declare those dependencies.
* The concept of "frozen" modules has been introduced, adding a layer of
complexity to regeneration.
* ``stdlib_module_names.h`` is a new file that must be generated using data
  from a working Python binary. To achieve this, Python is first built with a
  dummy ``stdlib_module_names.h``, the real file is then generated, and Python
  is rebuilt using it. Unfortunately this greatly increases the time taken to
  build Python, but it is not trivial to work around.
* A new generated script ``Lib/re/_casefix.py`` is introduced.
* The ssl module, now unbroken, can be built again.
* Very recent Python versions allow ``SOURCE_DATE_EPOCH`` to be used to
  remove non-determinism from precompiled Python libraries (``.pyc``).
  Finally, we can re-enable precompiling of Python modules.
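A minimal sketch of the mechanism (PEP 552 behaviour on CPython 3.7+, shown here for illustration): with ``SOURCE_DATE_EPOCH`` set, ``py_compile`` switches to hash-based ``.pyc`` files whose bytes depend only on the source contents, not on when the build ran:

```python
# Sketch: with SOURCE_DATE_EPOCH in the environment, py_compile emits
# hash-based .pyc files (PEP 552), so recompiling after the source mtime
# changes still yields byte-identical output.
import os
import tempfile
import py_compile

os.environ["SOURCE_DATE_EPOCH"] = "0"

src = os.path.join(tempfile.mkdtemp(), "mod.py")
with open(src, "w") as f:
    f.write("x = 1\n")

with open(py_compile.compile(src, cfile=src + ".first"), "rb") as f:
    first = f.read()
os.utime(src, (12345, 12345))  # pretend the tree was built at another time
with open(py_compile.compile(src, cfile=src + ".second"), "rb") as f:
    second = f.read()

print(first == second)  # identical bytes despite different timestamps
```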


@@ -91,6 +91,17 @@ df12820e27abfe07c4c27bb2f9abf2e0758b797d5d3036e29d6c57cfb5aa12d6 openssl-1.1.1l
31eda69af533e26b0cae543e02f2024fc2663dc47605f57fa58652117cbc1460 perl-5.32.1_0.tar.bz2
9ceb09af82397f98e99e339cb4fd3abd9f61d222ea7e6a0920e2f3a7c316c70a perl-5.6.2_0.tar.bz2
c69e0197ebc1bf9f9fc68a06d4c649c934784077058c24a484da59a153132816 pkg-config-0.29.2_0.tar.bz2
a04ac45d76a5432aa1f1492ec8787dd2834212568f95f65b17f7640892504458 python-2.0.1_0.tar.bz2
bd94e4a3a5d1af32056f096d01982ed36498f75fdc06cff3aa8db8a4917cf0b0 python-2.0.1_1.tar.bz2
143355d64c92d2cfa6bfd9c0f84049d29d972eda3f51249367799402d5f6412a python-2.3.7_0.tar.bz2
b623af7a04a7f6b7b135ec0b0330ac23ba1aab2a4a07b79d77d1a34e37742fed python-2.3.7_1.tar.bz2
83b3af3f7c58fd0ad36834499c00c6ca0e84f8d604236f14e6938a9e73b6d5d1 python-2.5.6_0.tar.bz2
a84c639c2c6a728d2a510074d4a6d0629a17fa6a2374f28f0913b33790b5cb41 python-3.1.5_0.tar.bz2
7c3aab540c9b25d1ad525e9b4cf3ce6affd140394eb4973303a647706b5f7532 python-3.1.5_1.tar.bz2
dbc1cb9db64e2580514f47a642de463b8834dcae9f713dccad49ac5a656b060c python-3.3.7_0.tar.bz2
0f5ea233b0df24a3b0545d2fdcf18a85e23e6a416e00b52172ceafe9d4be97ff python-3.4.10_0.tar.bz2
c483d72106b7ba83f37b1fea49c32a8af5485e685d910c9d805b4089c18bc4c7 python-3.8.16_0.tar.bz2
8b113e7273f5db2cee7ce27e5eba6d0548f1179a778095a95d3ddb45eda6eb0a python-3.11.1_0.tar.bz2
8a0248fbf8fe1764580698415cc3628585d4dd054ddf63040f400e18cbaef7a4 sed-4.0.9_0.tar.bz2
177553732a080e25ba5778525743543e9da012122f4ad0d314a425ca87a3c2bd sed-4.8_0.tar.bz2
f3be04bb46c9ac80180defa46c274214ab00b5b4dd9c8a3a6de162e43ef0fa20 tar-1.34_0.tar.bz2
@@ -99,7 +110,7 @@ f3be04bb46c9ac80180defa46c274214ab00b5b4dd9c8a3a6de162e43ef0fa20 tar-1.34_0.tar
db57c6ef39965f0562d2aefe3c06571df50ba1265446d97f2714d80518862cef tcc-0.9.27_2.tar.bz2
e2014b844b1a79cda9142a38af0404efd242ae02f77aa286c968e4ad6ad87265 tcc-0.9.27_3.tar.bz2
0c8b02693dac9483d845e7754919fdf21e97d695e5de13893c1356d0a9c22946 texinfo-6.7_0.tar.bz2
bf4a6be34cda165e4c206e852ccc09387f5ae8ea7db6de2db01297cabfa1a486 util-linux-2.19.1_0.tar.bz2
e3fb8277bec3c93887029d51aea1c53216fee41b8e5be5ff447da1cf543641c6 util-linux-2.19.1_0.tar.bz2
284d176b39312795bf155b794fc3c02070ff788d19307e926429fa3299faf283 which-2.21_0.tar.bz2
e900a8b70f49bfcbb7a48bd27e2de67c30454d693b6f35dcdfadd35570e98e69 xz-5.0.5_0.tar.bz2
068fcf87574883b29734bda3ccc45ef0e2be7aa6fb7e86941c78eb5a4de61389 zlib-1.2.13_0.tar.bz2


@@ -442,7 +442,6 @@ populate_device_nodes() {
test -c "/dev/urandom" || mknod -m 444 "/dev/urandom" c 1 9
if [ "${CHROOT}" = False ]; then
test -c "/dev/ptmx" || mknod -m 666 "/dev/ptmx" c 5 2
test -c "/dev/tty" || mknod -m 666 "/dev/tty" c 5 0
test -c "/dev/console" || mknod -m 666 "/dev/console" c 5 1
fi


@@ -1 +0,0 @@
8112529259780fe659ba68030d1ba1a64589ece80d0f328523395029827bd41f /usr/bin/make


@@ -1 +0,0 @@
https://mirrors.kernel.org/gnu/make/make-3.80.tar.bz2 a99b39e7b04c333724f48c38fede709481cfb69fafe7e32ae4285b7fadf92f1b


@@ -0,0 +1,11 @@
/*
* SPDX-FileCopyrightText: 2023 Paul Dersey <pdersey@gmail.com>
*
* SPDX-License-Identifier: GPL-2.0-or-later
*/
int putenv(char *string)
{
return 0;
}


@@ -0,0 +1 @@
a7de9406e3adf34577628447696020944f7961d8a9da32c0da930316a05d0710 /usr/bin/make


@@ -22,6 +22,9 @@ cd ${pkg}
# Create .h files
catm config.h
# Prepare
cp ../../files/putenv_stub.c putenv_stub.c
# Compile
tcc -c getopt.c
tcc -c getopt1.c
@@ -36,22 +39,24 @@ tcc -c -I. -DHAVE_INTTYPES_H -DHAVE_SA_RESTART -Dvfork=fork function.c
tcc -c -I. -DHAVE_INTTYPES_H -DHAVE_SA_RESTART implicit.c
tcc -c -I. -DHAVE_INTTYPES_H -DHAVE_SA_RESTART -DHAVE_DUP2 -DHAVE_STRCHR -Dvfork=fork job.c
tcc -c -I. -DHAVE_INTTYPES_H -DHAVE_SA_RESTART -DLOCALEDIR=\"/fake-locale\" -DPACKAGE=\"fake-make\" -DHAVE_MKTEMP -DHAVE_GETCWD main.c
tcc -c -I. -DHAVE_INTTYPES_H -DHAVE_SA_RESTART -DHAVE_STRERROR -DHAVE_VPRINTF misc.c
tcc -c -I. -DHAVE_INTTYPES_H -DHAVE_SA_RESTART -DHAVE_STRERROR -DHAVE_VPRINTF -DHAVE_ANSI_COMPILER -DHAVE_STDARG_H misc.c
tcc -c -I. -Iglob -DHAVE_INTTYPES_H -DHAVE_SA_RESTART -DINCLUDEDIR=\"${PREFIX}/include\" read.c
tcc -c -I. -DHAVE_INTTYPES_H -DHAVE_SA_RESTART -DFILE_TIMESTAMP_HI_RES=0 -DHAVE_FCNTL_H -DLIBDIR=\"${PREFIX}/lib\" remake.c
tcc -c -I. -DHAVE_INTTYPES_H -DHAVE_SA_RESTART rule.c
tcc -c -I. -DHAVE_INTTYPES_H -DHAVE_SA_RESTART signame.c
tcc -c -I. -DHAVE_INTTYPES_H -DHAVE_SA_RESTART strcache.c
tcc -c -I. -DHAVE_INTTYPES_H -DHAVE_SA_RESTART variable.c
tcc -c -I. -DVERSION=\"3.80\" version.c
tcc -c -I. -DVERSION=\"3.82\" version.c
tcc -c -I. -DHAVE_INTTYPES_H -DHAVE_SA_RESTART vpath.c
tcc -c -I. -DHAVE_INTTYPES_H -DHAVE_SA_RESTART hash.c
tcc -c -I. -DHAVE_INTTYPES_H -DHAVE_SA_RESTART remote-stub.c
tcc -c -DHAVE_FCNTL_H getloadavg.c
tcc -c -Iglob -DSTDC_HEADERS glob/fnmatch.c
tcc -c -Iglob -DHAVE_STRDUP -DHAVE_DIRENT_H glob/glob.c
tcc -c putenv_stub.c
# Link
tcc -static -o ${bindir}/make getopt.o getopt1.o ar.o arscan.o commands.o default.o dir.o expand.o file.o function.o implicit.o job.o main.o misc.o read.o remake.o rule.o signame.o variable.o version.o vpath.o hash.o remote-stub.o getloadavg.o fnmatch.o glob.o
tcc -static -o ${bindir}/make getopt.o getopt1.o ar.o arscan.o commands.o default.o dir.o expand.o file.o function.o implicit.o job.o main.o misc.o read.o remake.o rule.o signame.o strcache.o variable.o version.o vpath.o hash.o remote-stub.o getloadavg.o fnmatch.o glob.o putenv_stub.o
# Test
make --version


@@ -43,7 +43,7 @@ kaem --file ${pkg}.kaem
cd ..
# make
pkg="make-3.80"
pkg="make-3.82"
cd ${pkg}
kaem --file ${pkg}.kaem
cd ..


@@ -2,6 +2,9 @@
#
# SPDX-License-Identifier: GPL-3.0-or-later
# XXX: If you change the version of this, you must update the corresponding
# tarball in Python 3.11.
src_prepare() {
default


@@ -0,0 +1,107 @@
SPDX-FileCopyrightText: 2022 fosslinux <fosslinux@aussies.space>
SPDX-License-Identifier: Python-2.0.1
unicodetype_db.h is a file that needs to be regened, but it is not
particularly trivial to regen. For the first build of Python,
strip out any kind of unicode support that requires
unicodetype_db.h indiscriminately.
We are effectively restricted to ASCII characters with this change,
but it works.
--- Objects/unicodectype.c 2000-09-26 08:48:13.000000000 +1100
+++ Objects/unicodectype.c 2022-10-03 21:09:02.108869321 +1100
@@ -29,30 +29,12 @@
const unsigned char digit;
} _PyUnicode_TypeRecord;
-#include "unicodetype_db.h"
-
-static const _PyUnicode_TypeRecord *
-gettyperecord(int code)
-{
- int index;
-
- if (code < 0 || code >= 65536)
- index = 0;
- else {
- index = index1[(code>>SHIFT)];
- index = index2[(index<<SHIFT)+(code&((1<<SHIFT)-1))];
- }
- return &_PyUnicode_TypeRecords[index];
-}
-
/* Returns 1 for Unicode characters having the category 'Zl' or type
'B', 0 otherwise. */
int _PyUnicode_IsLinebreak(register const Py_UNICODE ch)
{
- const _PyUnicode_TypeRecord *ctype = gettyperecord(ch);
-
- return (ctype->flags & LINEBREAK_MASK) != 0;
+ return 0;
}
/* Returns the titlecase Unicode characters corresponding to ch or just
@@ -60,12 +44,7 @@
Py_UNICODE _PyUnicode_ToTitlecase(register const Py_UNICODE ch)
{
- const _PyUnicode_TypeRecord *ctype = gettyperecord(ch);
-
- if (ctype->title)
- return ch + ctype->title;
-
- return ch + ctype->upper;
+ return ch;
}
/* Returns 1 for Unicode characters having the category 'Lt', 0
@@ -73,9 +52,7 @@
int _PyUnicode_IsTitlecase(register const Py_UNICODE ch)
{
- const _PyUnicode_TypeRecord *ctype = gettyperecord(ch);
-
- return (ctype->flags & TITLE_MASK) != 0;
+ return 0;
}
/* Returns the integer decimal (0-9) for Unicode characters having
@@ -83,15 +60,13 @@
int _PyUnicode_ToDecimalDigit(register const Py_UNICODE ch)
{
- const _PyUnicode_TypeRecord *ctype = gettyperecord(ch);
-
- return (ctype->flags & DECIMAL_MASK) ? ctype->decimal : -1;
+ return -1;
}
int _PyUnicode_IsDecimalDigit(register const Py_UNICODE ch)
{
if (_PyUnicode_ToDecimalDigit(ch) < 0)
- return 0;
+ return 0;
return 1;
}
@@ -100,15 +75,13 @@
int _PyUnicode_ToDigit(register const Py_UNICODE ch)
{
- const _PyUnicode_TypeRecord *ctype = gettyperecord(ch);
-
- return (ctype->flags & DIGIT_MASK) ? ctype->digit : -1;
+ return -1;
}
int _PyUnicode_IsDigit(register const Py_UNICODE ch)
{
if (_PyUnicode_ToDigit(ch) < 0)
- return 0;
+ return 0;
return 1;
}


@@ -0,0 +1,45 @@
/*
* SPDX-FileCopyrightText: 2022 fosslinux <fosslinux@aussies.space>
*
* SPDX-License-Identifier: Python-2.0.1
*
* Reimplementation of keyword.py main() in C, to break bootstrapping loop
*/
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#define MAX_LINE 128
int main() {
char filename[] = "Lib/keyword.py";
FILE *orig = fopen(filename, "r");
/* Read-write until starter line */
char *line = malloc(MAX_LINE);
do {
fgets(line, MAX_LINE, orig);
puts(line);
} while (strcmp(line, "#--start keywords--\n") != 0);
/* Perform the actual transformation */
while (fgets(line, MAX_LINE, stdin) != NULL) {
char *token = line;
while (*token != '"') token++;
token++;
/* Now at beginning of keyword */
char *end = token;
while (*end != '"') end++;
*end = '\0';
/* Write output line to stdout */
printf("'%s',\n", token);
/* For each line also advance orig pointer */
fgets(line, MAX_LINE, orig);
/* Cleanup */
free(line);
line = malloc(MAX_LINE);
}
/* Read-write until end */
while (fgets(line, MAX_LINE, orig) != NULL) {
puts(line);
}
}
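The core transformation (take each input line's first double-quoted token and emit it as a quoted Python list entry) can be illustrated with sed; the input lines below are made up, and the C helper exists precisely because a working Python to run Lib/keyword.py itself is unavailable at this stage:

```shell
# Each input line's first "quoted" token becomes a 'quoted', list entry,
# matching the printf("'%s',\n", token) in the C helper above.
printf '%s\n' '{1, "if"},' '{1, "while"},' |
    sed "s/^[^\"]*\"\([^\"]*\)\".*/'\1',/"
# prints:
#   'if',
#   'while',
```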


@@ -0,0 +1,46 @@
/*
* SPDX-FileCopyrightText: 2022 fosslinux <fosslinux@aussies.space>
*
* SPDX-License-Identifier: Python-2.0.1
*
* Reimplementation of token.py main() in C, to break bootstrapping loop
*/
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#define MAX_LINE 128
int main(int argc, char** argv) {
char *filename = argv[1];
FILE *orig = fopen(filename, "r");
/* Read-write until starter line */
char *line = malloc(MAX_LINE);
do {
fgets(line, MAX_LINE, orig);
puts(line);
} while (strcmp(line, "#--start constants--\n") != 0);
/* Perform the actual transformation */
while (fgets(line, MAX_LINE, stdin) != NULL) {
/* Transform input into output */
char *tokena = line + 8;
char *tokenb = strstr(tokena, "\t");
if (tokenb == 0) tokenb = strstr(tokena, " ");
*tokenb = '\0';
tokenb++;
while (*tokenb == '\t' || *tokenb == ' ') tokenb++;
/* Write output line to stdout */
printf("%s = %s", tokena, tokenb);
/* For each line also advance orig pointer */
fgets(line, MAX_LINE, orig);
/* Cleanup */
free(line);
line = malloc(MAX_LINE);
}
/* Read-write until end */
while (fgets(line, MAX_LINE, orig) != NULL) {
puts(line);
fflush(stdout);
}
}


@@ -0,0 +1,206 @@
SPDX-FileCopyrightText: 2022 fosslinux <fosslinux@aussies.space>
SPDX-License-Identifier: Python-2.0.1
Python 2.0 does not support DESTDIR, so add it in.
--- Makefile.in 2022-10-21 17:56:48.034287578 +1100
+++ Makefile.in 2022-10-21 18:07:54.267542882 +1100
@@ -224,16 +224,16 @@
# Install the interpreter (by creating a hard link to python$(VERSION))
bininstall: altbininstall
- -if test -f $(BINDIR)/python$(EXE); \
- then rm -f $(BINDIR)/python$(EXE); \
+ -if test -f $(DESTDIR)$(BINDIR)/python$(EXE); \
+ then rm -f $(DESTDIR)$(BINDIR)/python$(EXE); \
else true; \
fi
- (cd $(BINDIR); $(LN) python$(VERSION)$(EXE) python$(EXE))
+ (cd $(DESTDIR)$(BINDIR); $(LN) python$(VERSION)$(EXE) python$(EXE))
# Install the interpreter with $(VERSION) affixed
# This goes into $(exec_prefix)
altbininstall: python$(EXE)
- @for i in $(BINDIR); \
+ @for i in $(DESTDIR)$(BINDIR); \
do \
if test ! -d $$i; then \
echo "Creating directory $$i"; \
@@ -242,15 +242,15 @@
else true; \
fi; \
done
- $(INSTALL_PROGRAM) python$(EXE) $(BINDIR)/python$(VERSION)$(EXE)
+ $(INSTALL_PROGRAM) python$(EXE) $(DESTDIR)$(BINDIR)/python$(VERSION)$(EXE)
if test -f libpython$(VERSION).so; then \
- $(INSTALL_DATA) libpython$(VERSION).so $(LIBDIR); \
+ $(INSTALL_DATA) libpython$(VERSION).so $(DESTDIR)$(LIBDIR); \
else true; \
fi
# Install the manual page
maninstall:
- @for i in $(MANDIR) $(MANDIR)/man1; \
+ @for i in $(DESTDIR)$(MANDIR) $(DESTDIR)$(MANDIR)/man1; \
do \
if test ! -d $$i; then \
echo "Creating directory $$i"; \
@@ -260,7 +260,7 @@
fi; \
done
$(INSTALL_DATA) $(srcdir)/Misc/python.man \
- $(MANDIR)/man1/python.1
+ $(DESTDIR)$(MANDIR)/man1/python.1
# Install the library
PLATDIR= plat-$(MACHDEP)
@@ -269,7 +269,7 @@
LIBSUBDIRS= lib-old lib-tk site-packages test test/output encodings \
distutils distutils/command $(XMLLIBSUBDIRS) curses $(MACHDEPS)
libinstall: python $(srcdir)/Lib/$(PLATDIR)
- @for i in $(SCRIPTDIR) $(LIBDEST); \
+ @for i in $(DESTDIR)$(SCRIPTDIR) $(DESTDIR)$(LIBDEST); \
do \
if test ! -d $$i; then \
echo "Creating directory $$i"; \
@@ -278,11 +278,11 @@
else true; \
fi; \
done
- @for d in $(LIBSUBDIRS); \
+ @for d in $(DESTDIR)$(LIBSUBDIRS); \
do \
a=$(srcdir)/Lib/$$d; \
if test ! -d $$a; then continue; else true; fi; \
- b=$(LIBDEST)/$$d; \
+ b=$(DESTDIR)$(LIBDEST)/$$d; \
if test ! -d $$b; then \
echo "Creating directory $$b"; \
mkdir $$b; \
@@ -293,18 +293,18 @@
@for i in $(srcdir)/Lib/*.py $(srcdir)/Lib/*.doc; \
do \
if test -x $$i; then \
- $(INSTALL_PROGRAM) $$i $(LIBDEST); \
- echo $(INSTALL_PROGRAM) $$i $(LIBDEST); \
+ $(INSTALL_PROGRAM) $$i $(DESTDIR)$(LIBDEST); \
+ echo $(INSTALL_PROGRAM) $$i $(DESTDIR)$(LIBDEST); \
else \
- $(INSTALL_DATA) $$i $(LIBDEST); \
- echo $(INSTALL_DATA) $$i $(LIBDEST); \
+ $(INSTALL_DATA) $$i $(DESTDIR)$(LIBDEST); \
+ echo $(INSTALL_DATA) $$i $(DESTDIR)$(LIBDEST); \
fi; \
done
@for d in $(LIBSUBDIRS); \
do \
a=$(srcdir)/Lib/$$d; \
if test ! -d $$a; then continue; else true; fi; \
- b=$(LIBDEST)/$$d; \
+ b=$(DESTDIR)$(LIBDEST)/$$d; \
for i in $$a/*; \
do \
case $$i in \
@@ -324,11 +324,11 @@
esac; \
done; \
done
- $(INSTALL_DATA) $(srcdir)/LICENSE $(LIBDEST)/LICENSE.txt
- PYTHONPATH=$(LIBDEST) \
- ./python$(EXE) -tt $(LIBDEST)/compileall.py $(LIBDEST)
- PYTHONPATH=$(LIBDEST) \
- ./python$(EXE) -O $(LIBDEST)/compileall.py $(LIBDEST)
+ $(INSTALL_DATA) $(srcdir)/LICENSE $(DESTDIR)$(LIBDEST)/LICENSE.txt
+ PYTHONPATH=$(DESTDIR)$(LIBDEST) \
+ ./python$(EXE) -tt $(DESTDIR)$(LIBDEST)/compileall.py $(DESTDIR)$(LIBDEST)
+ PYTHONPATH=$(DESTDIR)$(LIBDEST) \
+ ./python$(EXE) -O $(DESTDIR)$(LIBDEST)/compileall.py $(DESTDIR)$(LIBDEST)
# Create the PLATDIR source directory, if one wasn't distributed..
$(srcdir)/Lib/$(PLATDIR):
@@ -344,25 +344,25 @@
inclinstall:
@for i in $(INCLDIRSTOMAKE); \
do \
- if test ! -d $$i; then \
- echo "Creating directory $$i"; \
- mkdir $$i; \
- chmod $(DIRMODE) $$i; \
+ if test ! -d $(DESTDIR)$$i; then \
+ echo "Creating directory $(DESTDIR)$$i"; \
+ mkdir $(DESTDIR)$$i; \
+ chmod $(DIRMODE) $(DESTDIR)$$i; \
else true; \
fi; \
done
@for i in $(srcdir)/Include/*.h; \
do \
- echo $(INSTALL_DATA) $$i $(INCLUDEPY); \
- $(INSTALL_DATA) $$i $(INCLUDEPY); \
+ echo $(INSTALL_DATA) $$i $(DESTDIR)$(INCLUDEPY); \
+ $(INSTALL_DATA) $$i $(DESTDIR)$(INCLUDEPY); \
done
- $(INSTALL_DATA) config.h $(CONFINCLUDEPY)/config.h
+ $(INSTALL_DATA) config.h $(DESTDIR)$(CONFINCLUDEPY)/config.h
# Install the library and miscellaneous stuff needed for extending/embedding
# This goes into $(exec_prefix)
LIBPL= $(LIBP)/config
libainstall: all
- @for i in $(LIBDIR) $(LIBP) $(LIBPL); \
+ @for i in $(DESTDIR)$(LIBDIR) $(DESTDIR)$(LIBP) $(DESTDIR)$(LIBPL); \
do \
if test ! -d $$i; then \
echo "Creating directory $$i"; \
@@ -372,19 +372,19 @@
fi; \
done
@if test -d $(LIBRARY); then :; else \
- $(INSTALL_DATA) $(LIBRARY) $(LIBPL)/$(LIBRARY) ; \
- $(RANLIB) $(LIBPL)/$(LIBRARY) ; \
+ $(INSTALL_DATA) $(LIBRARY) $(DESTDIR)$(LIBPL)/$(LIBRARY) ; \
+ $(RANLIB) $(DESTDIR)$(LIBPL)/$(LIBRARY) ; \
fi
- $(INSTALL_DATA) Modules/config.c $(LIBPL)/config.c
- $(INSTALL_DATA) Modules/python.o $(LIBPL)/python.o
- $(INSTALL_DATA) $(srcdir)/Modules/config.c.in $(LIBPL)/config.c.in
- $(INSTALL_DATA) Modules/Makefile $(LIBPL)/Makefile
- $(INSTALL_DATA) Modules/Setup $(LIBPL)/Setup
- $(INSTALL_DATA) Modules/Setup.local $(LIBPL)/Setup.local
- $(INSTALL_DATA) Modules/Setup.config $(LIBPL)/Setup.config
- $(INSTALL_PROGRAM) $(srcdir)/Modules/makesetup $(LIBPL)/makesetup
- $(INSTALL_PROGRAM) $(srcdir)/install-sh $(LIBPL)/install-sh
- $(INSTALL_DATA) $(srcdir)/Misc/Makefile.pre.in $(LIBPL)/Makefile.pre.in
+ $(INSTALL_DATA) Modules/config.c $(DESTDIR)$(LIBPL)/config.c
+ $(INSTALL_DATA) Modules/python.o $(DESTDIR)$(LIBPL)/python.o
+ $(INSTALL_DATA) $(srcdir)/Modules/config.c.in $(DESTDIR)$(LIBPL)/config.c.in
+ $(INSTALL_DATA) Modules/Makefile $(DESTDIR)$(LIBPL)/Makefile
+ $(INSTALL_DATA) Modules/Setup $(DESTDIR)$(LIBPL)/Setup
+ $(INSTALL_DATA) Modules/Setup.local $(DESTDIR)$(LIBPL)/Setup.local
+ $(INSTALL_DATA) Modules/Setup.config $(DESTDIR)$(LIBPL)/Setup.config
+ $(INSTALL_PROGRAM) $(srcdir)/Modules/makesetup $(DESTDIR)$(LIBPL)/makesetup
+ $(INSTALL_PROGRAM) $(srcdir)/install-sh $(DESTDIR)$(LIBPL)/install-sh
+ $(INSTALL_DATA) $(srcdir)/Misc/Makefile.pre.in $(DESTDIR)$(LIBPL)/Makefile.pre.in
@if [ -s Modules/python.exp -a \
"`echo $(MACHDEP) | sed 's/^\(...\).*/\1/'`" = "aix" ]; then \
echo; echo "Installing support files for building shared extension modules on AIX:"; \
@@ -425,6 +425,7 @@
CCSHARED="$(CCSHARED)" \
LINKFORSHARED="$(LINKFORSHARED)" \
DESTSHARED="$(DESTSHARED)" \
+ DESTDIR="$(DESTDIR)" \
prefix="$(prefix)" \
exec_prefix="$(exec_prefix)" \
sharedinstall
--- Modules/Makefile.pre.in 2022-10-21 17:56:44.635251486 +1100
+++ Modules/Makefile.pre.in 2022-10-21 17:57:00.124415957 +1100
@@ -240,7 +240,7 @@
sharedinstall: $(DESTSHARED) $(SHAREDMODS)
-for i in X $(SHAREDMODS); do \
if test $$i != X; \
- then $(INSTALL_SHARED) $$i $(DESTSHARED)/$$i; \
+ then $(INSTALL_SHARED) $$i $(DESTDIR)$(DESTSHARED)/$$i; \
fi; \
done
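
The convention this patch retrofits is the standard staged-install pattern: DESTDIR is prepended verbatim to every already-absolute install path at `make install` time, while only the configured prefix is baked into the binary. A minimal sketch of the path arithmetic (names are illustrative):

```python
import os

def staged_path(destdir, installdir, name):
    """DESTDIR handling is plain string prepending, not os.path.join:
    installdir is already absolute, so joining would discard destdir."""
    return destdir + os.path.join(installdir, name)

# Staged install into a scratch root, as a package manager would do:
print(staged_path("/tmp/stage", "/usr/bin", "python2.0"))
# -> /tmp/stage/usr/bin/python2.0

# With an empty DESTDIR this degrades to a normal in-place install:
print(staged_path("", "/usr/bin", "python2.0"))
# -> /usr/bin/python2.0
```

This is why the patch writes `$(DESTDIR)$(BINDIR)` with no separator: `$(BINDIR)` already starts with `/`.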

@@ -0,0 +1,33 @@
SPDX-FileCopyrightText: 2022 fosslinux <fosslinux@aussies.space>
SPDX-License-Identifier: Python-2.0.1
musl (correctly) implements the POSIX posix_close function; however, it was
standardized well after Python 2.0.1 was released, and Python's own static
posix_close collides with musl's declaration, so it is renamed here.
--- Modules/posixmodule.c 2022-09-16 16:46:09.809812072 +1000
+++ Modules/posixmodule.c 2022-09-16 16:47:23.254166370 +1000
@@ -3267,12 +3267,12 @@
}
-static char posix_close__doc__[] =
+static char py_posix_close__doc__[] =
"close(fd) -> None\n\
Close a file descriptor (for low level IO).";
static PyObject *
-posix_close(PyObject *self, PyObject *args)
+py_posix_close(PyObject *self, PyObject *args)
{
int fd, res;
if (!PyArg_ParseTuple(args, "i:close", &fd))
@@ -5300,7 +5300,7 @@
{"tcsetpgrp", posix_tcsetpgrp, METH_VARARGS, posix_tcsetpgrp__doc__},
#endif /* HAVE_TCSETPGRP */
{"open", posix_open, METH_VARARGS, posix_open__doc__},
- {"close", posix_close, METH_VARARGS, posix_close__doc__},
+ {"close", py_posix_close, METH_VARARGS, py_posix_close__doc__},
{"dup", posix_dup, METH_VARARGS, posix_dup__doc__},
{"dup2", posix_dup2, METH_VARARGS, posix_dup2__doc__},
{"lseek", posix_lseek, METH_VARARGS, posix_lseek__doc__},

@@ -0,0 +1,18 @@
SPDX-FileCopyrightText: 2022 fosslinux <fosslinux@aussies.space>
SPDX-License-Identifier: Python-2.0.1
Python 2.0.1's Makefile does not honour user-supplied CFLAGS, so we have
to patch our __DATE__/__TIME__ undefs in directly.
--- Makefile.in 2022-12-23 18:33:56.486325025 +1100
+++ Makefile.in 2022-12-23 18:46:05.910387214 +1100
@@ -127,7 +127,7 @@
DIST= $(DISTFILES) $(DISTDIRS)
# Compilation flags for getbuildinfo.c only
-CFLAGS= $(OPT) -I. $(DEFS)
+CFLAGS= $(OPT) -I. $(DEFS) -U__DATE__ -U__TIME__
LIBRARY= libpython$(VERSION).a
LDLIBRARY= @LDLIBRARY@

@@ -0,0 +1,2 @@
https://www.python.org/ftp/python/2.0.1/Python-2.0.1.tgz 98557b819a42d2093b41d8637302d1311b81f627af9ad20036357d7eb2813872
http://ftp.unicode.org/Public/3.0-Update/UnicodeData-3.0.0.txt f41d967bc458ee106f0c3948bfad71cd0860d96c49304e3fd02eaf2bbae4b6d9

sysc/python-2.0.1/stage1.sh Executable file
@@ -0,0 +1,69 @@
# SPDX-FileCopyrightText: 2022 fosslinux <fosslinux@aussies.space>
#
# SPDX-License-Identifier: GPL-3.0-or-later
src_prepare() {
default
# Delete generated files
rm Modules/glmodule.c
rm Modules/unicodedata_db.h Objects/unicodetype_db.h
rm Modules/sre_constants.h
mv Lib/plat-generic .
rm -r Lib/plat-*
mv plat-generic Lib/
grep generated -r . -l | grep encodings | xargs rm
# Disable sre and unicodedata modules
sed -i "/^_sre/d" Modules/Setup.in
sed -i "/^unicodedata/d" Modules/Setup.in
# Patch
patch -Np0 -i disable-unicode.patch
# Regenerate autoconf
autoreconf-2.71 -fi
}
src_configure() {
MACHDEP=linux ac_sys_system=Linux \
./configure \
--prefix="${PREFIX}" \
--libdir="${PREFIX}/lib/musl" \
--with-wctype-functions
}
src_compile() {
# Build pgen
pushd Parser
make pgen
popd
# Regen graminit.c and graminit.h
pushd Grammar
make graminit.c
popd
# Regenerate some Python scripts using the other regenerated files
gcc -o keyword keyword.c
gcc -o token token.c
# This gets all of the grammar tokens
grep -E '\{1, "[^"]+"' Python/graminit.c | ./keyword > Lib/keyword.py.new
mv Lib/keyword.py.new Lib/keyword.py
./token Lib/symbol.py < Include/graminit.h > Lib/symbol.py.new
mv Lib/symbol.py.new Lib/symbol.py
# These get all of the #defines that have to be translated
grep '#define[[:space:]][A-Z]*[[:space:]][[:space:]]*[0-9][0-9]*' Include/token.h | ./token Lib/token.py > Lib/token.py.new
mv Lib/token.py.new Lib/token.py
# Now build the main program
make
}
src_install() {
mkdir -p "${DESTDIR}/usr"
default
# Remove non-reproducible .pyc/o files
find "${DESTDIR}" -name "*.pyc" -delete
find "${DESTDIR}" -name "*.pyo" -delete
}
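
The token-extraction pipeline in src_compile() above rests on one observation: Python/graminit.c embeds each grammar keyword in a label record of the form `{1, "name"}`, so grepping those records out recovers the keyword list. A miniature sketch of that extraction (the input lines are fabricated samples, not real graminit.c content):

```python
import re

# Fabricated sample lines in the shape found in Python/graminit.c
graminit_lines = [
    '    {1, "lambda"},',
    '    {1, "import"},',
    '    {2, 0},',  # non-keyword records do not match the pattern
]

keywords = []
for line in graminit_lines:
    m = re.search(r'\{1, "([^"]+)"', line)
    if m:
        keywords.append(m.group(1))

print(keywords)  # ['lambda', 'import']
```

The build script performs the same match with `grep -E '\{1, "[^"]+"'` and pipes the hits into the compiled keyword helper to rewrite Lib/keyword.py.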

sysc/python-2.0.1/stage2.sh Executable file
@@ -0,0 +1,68 @@
# SPDX-FileCopyrightText: 2022 fosslinux <fosslinux@aussies.space>
#
# SPDX-License-Identifier: GPL-3.0-or-later
src_prepare() {
default
# Delete generated files
rm Modules/glmodule.c
mv Lib/plat-generic .
rm -r Lib/plat-*
mv plat-generic Lib/
grep generated -r . -l | grep encodings | xargs rm
# Regenerate unicode
rm Modules/unicodedata_db.h Objects/unicodetype_db.h
mv ../UnicodeData-3.0.0.txt UnicodeData-Latest.txt
python Tools/unicode/makeunicodedata.py
# Regenerate sre_constants.h
rm Modules/sre_constants.h
python Lib/sre_constants.py
# Regenerate autoconf
autoreconf-2.71 -fi
}
src_configure() {
MACHDEP=linux ac_sys_system=Linux \
./configure \
--prefix="${PREFIX}" \
--libdir="${PREFIX}/lib/musl"
}
src_compile() {
# Build pgen
pushd Parser
make pgen
popd
# Regen graminit.c and graminit.h
pushd Grammar
make graminit.c
popd
# Regenerate some Python scripts using the other regenerated files
gcc -o keyword keyword.c
gcc -o token token.c
# This gets all of the grammar tokens
grep -E '\{1, "[^"]+"' Python/graminit.c | ./keyword > Lib/keyword.py.new
mv Lib/keyword.py.new Lib/keyword.py
./token Lib/symbol.py < Include/graminit.h > Lib/symbol.py.new
mv Lib/symbol.py.new Lib/symbol.py
# These get all of the #defines that have to be translated
grep '#define[[:space:]][A-Z]*[[:space:]][[:space:]]*[0-9][0-9]*' Include/token.h | ./token Lib/token.py > Lib/token.py.new
mv Lib/token.py.new Lib/token.py
# Now build the main program
make
}
src_install() {
mkdir -p "${DESTDIR}/usr"
default
# Remove non-reproducible .pyc/o files
find "${DESTDIR}" -name "*.pyc" -delete
find "${DESTDIR}" -name "*.pyo" -delete
}

@@ -0,0 +1,110 @@
SPDX-FileCopyrightText: 2022 fosslinux <fosslinux@aussies.space>
SPDX-License-Identifier: PSF-2.0
This is nearly equivalent to the Python 2.0.1 disable-unicode patch, with
nearly identical reasoning: Python 2.3's unicode regeneration code is too
incompatible with Python 2.0.1 to run under it.
--- Objects/unicodectype.c 2022-10-05 18:11:21.989603599 +1100
+++ Objects/unicodectype.c 2022-10-05 18:14:57.335843857 +1100
@@ -29,31 +29,12 @@
const unsigned char digit;
} _PyUnicode_TypeRecord;
-#include "unicodetype_db.h"
-
-static const _PyUnicode_TypeRecord *
-gettyperecord(Py_UNICODE code)
-{
- int index;
-
- if (code >= 0x110000)
- index = 0;
- else {
- index = index1[(code>>SHIFT)];
- index = index2[(index<<SHIFT)+(code&((1<<SHIFT)-1))];
- }
-
- return &_PyUnicode_TypeRecords[index];
-}
-
/* Returns 1 for Unicode characters having the category 'Zl' or type
'B', 0 otherwise. */
int _PyUnicode_IsLinebreak(Py_UNICODE ch)
{
- const _PyUnicode_TypeRecord *ctype = gettyperecord(ch);
-
- return (ctype->flags & LINEBREAK_MASK) != 0;
+ return 0;
}
/* Returns the titlecase Unicode characters corresponding to ch or just
@@ -61,18 +42,7 @@
Py_UNICODE _PyUnicode_ToTitlecase(register Py_UNICODE ch)
{
- const _PyUnicode_TypeRecord *ctype = gettyperecord(ch);
- int delta;
-
- if (ctype->title)
- delta = ctype->title;
- else
- delta = ctype->upper;
-
- if (delta >= 32768)
- delta -= 65536;
-
- return ch + delta;
+ return ch;
}
/* Returns 1 for Unicode characters having the category 'Lt', 0
@@ -80,9 +50,7 @@
int _PyUnicode_IsTitlecase(Py_UNICODE ch)
{
- const _PyUnicode_TypeRecord *ctype = gettyperecord(ch);
-
- return (ctype->flags & TITLE_MASK) != 0;
+ return 0;
}
/* Returns the integer decimal (0-9) for Unicode characters having
@@ -90,9 +58,7 @@
int _PyUnicode_ToDecimalDigit(Py_UNICODE ch)
{
- const _PyUnicode_TypeRecord *ctype = gettyperecord(ch);
-
- return (ctype->flags & DECIMAL_MASK) ? ctype->decimal : -1;
+ return -1;
}
int _PyUnicode_IsDecimalDigit(Py_UNICODE ch)
@@ -107,9 +73,7 @@
int _PyUnicode_ToDigit(Py_UNICODE ch)
{
- const _PyUnicode_TypeRecord *ctype = gettyperecord(ch);
-
- return (ctype->flags & DIGIT_MASK) ? ctype->digit : -1;
+ return -1;
}
int _PyUnicode_IsDigit(Py_UNICODE ch)
--- Makefile.pre.in 2005-01-12 00:49:02.000000000 +1100
+++ Makefile.pre.in 2022-10-05 18:35:05.979846971 +1100
@@ -456,8 +456,7 @@
Python/importdl.o: $(srcdir)/Python/importdl.c
$(CC) -c $(PY_CFLAGS) -I$(DLINCLDIR) -o $@ $(srcdir)/Python/importdl.c
-Objects/unicodectype.o: $(srcdir)/Objects/unicodectype.c \
- $(srcdir)/Objects/unicodetype_db.h
+Objects/unicodectype.o: $(srcdir)/Objects/unicodectype.c
############################################################################
# Header files

@@ -0,0 +1,33 @@
SPDX-FileCopyrightText: 2022 fosslinux <fosslinux@aussies.space>
SPDX-License-Identifier: PSF-2.0
musl (correctly) implements the POSIX posix_close function; however, this
was added after Python 2.3.7 was released, so Python's own posix_close is
renamed to avoid the clash.
--- Modules/posixmodule.c 2022-10-05 18:38:46.718131893 +1100
+++ Modules/posixmodule.c 2022-10-05 18:39:07.049250312 +1100
@@ -5208,12 +5208,12 @@
}
-PyDoc_STRVAR(posix_close__doc__,
+PyDoc_STRVAR(py_posix_close__doc__,
"close(fd)\n\n\
Close a file descriptor (for low level IO).");
static PyObject *
-posix_close(PyObject *self, PyObject *args)
+py_posix_close(PyObject *self, PyObject *args)
{
int fd, res;
if (!PyArg_ParseTuple(args, "i:close", &fd))
@@ -7371,7 +7371,7 @@
{"tcsetpgrp", posix_tcsetpgrp, METH_VARARGS, posix_tcsetpgrp__doc__},
#endif /* HAVE_TCSETPGRP */
{"open", posix_open, METH_VARARGS, posix_open__doc__},
- {"close", posix_close, METH_VARARGS, posix_close__doc__},
+ {"close", py_posix_close, METH_VARARGS, py_posix_close__doc__},
{"dup", posix_dup, METH_VARARGS, posix_dup__doc__},
{"dup2", posix_dup2, METH_VARARGS, posix_dup2__doc__},
{"lseek", posix_lseek, METH_VARARGS, posix_lseek__doc__},

@@ -0,0 +1,3 @@
https://www.python.org/ftp/python/2.3.7/Python-2.3.7.tgz 969a9891dce9f50b13e54f9890acaf2be66715a5895bf9b11111f320c205b90e
http://ftp.unicode.org/Public/3.2-Update/UnicodeData-3.2.0.txt 5e444028b6e76d96f9dc509609c5e3222bf609056f35e5fcde7e6fb8a58cd446
http://ftp.unicode.org/Public/3.2-Update/CompositionExclusions-3.2.0.txt 1d3a450d0f39902710df4972ac4a60ec31fbcb54ffd4d53cd812fc1200c732cb

sysc/python-2.3.7/stage1.sh Executable file
@@ -0,0 +1,70 @@
# SPDX-FileCopyrightText: 2022 fosslinux <fosslinux@aussies.space>
#
# SPDX-License-Identifier: GPL-3.0-or-later
src_prepare() {
default
# Remove broken file
rm Lib/test/test_pep263.py
# Delete generated files
rm Modules/glmodule.c
rm Modules/unicodedata_db.h Objects/unicodetype_db.h
rm Lib/stringprep.py
mv Lib/plat-generic .
rm -r Lib/plat-*
mv plat-generic Lib/
grep generated -r . -l | grep encodings | xargs rm
# Disable unicode
patch -Np0 -i disable-unicode.patch
# Regenerate sre_constants.h
rm Modules/sre_constants.h
python Lib/sre_constants.py
# Regen ast module
rm Lib/compiler/ast.py
pushd Tools/compiler
python astgen.py > ../../Lib/compiler/ast.py
popd
# Regenerate autoconf
autoreconf-2.71 -fi
}
src_configure() {
MACHDEP=linux ac_sys_system=Linux \
CFLAGS="-U__DATE__ -U__TIME__" \
./configure \
--prefix="${PREFIX}" \
--libdir="${PREFIX}/lib/musl" \
--with-wctype-functions
}
src_compile() {
# Build pgen
make Parser/pgen
# Regen graminit.c and graminit.h
make Include/graminit.h
# Regenerate some Python scripts using the other regenerated files
# Must move them out to avoid using Lib/ module files which are
# incompatible with running version of Python
cp Lib/{symbol,keyword,token}.py .
python symbol.py
python keyword.py
python token.py
# Now build the main program
make CFLAGS="-U__DATE__ -U__TIME__"
}
src_install() {
default
# Remove non-reproducible .pyc/o files
find "${DESTDIR}" -name "*.pyc" -delete
find "${DESTDIR}" -name "*.pyo" -delete
}

sysc/python-2.3.7/stage2.sh Executable file
@@ -0,0 +1,71 @@
# SPDX-FileCopyrightText: 2022 fosslinux <fosslinux@aussies.space>
#
# SPDX-License-Identifier: GPL-3.0-or-later
src_prepare() {
default
# Remove broken file
rm Lib/test/test_pep263.py
# Delete generated files
rm Modules/glmodule.c
rm Lib/stringprep.py
mv Lib/plat-generic .
rm -r Lib/plat-*
mv plat-generic Lib/
grep generated -r . -l | grep encodings | xargs rm
# Regenerate unicode
rm Modules/unicodedata_db.h Modules/unicodename_db.h Objects/unicodetype_db.h
mv ../UnicodeData-3.2.0.txt UnicodeData.txt
mv ../CompositionExclusions-3.2.0.txt CompositionExclusions.txt
python Tools/unicode/makeunicodedata.py
# Regenerate sre_constants.h
rm Modules/sre_constants.h
python Lib/sre_constants.py
# Regen ast module
rm Lib/compiler/ast.py
pushd Tools/compiler
python astgen.py > ../../Lib/compiler/ast.py
popd
# Regenerate autoconf
autoreconf-2.71 -fi
}
src_configure() {
MACHDEP=linux ac_sys_system=Linux \
CFLAGS="-U__DATE__ -U__TIME__" \
./configure \
--prefix="${PREFIX}" \
--libdir="${PREFIX}/lib/musl"
}
src_compile() {
# Build pgen
make Parser/pgen
# Regen graminit.c and graminit.h
make Include/graminit.h
# Regenerate some Python scripts using the other regenerated files
# Must move them out to avoid using Lib/ module files which are
# incompatible with running version of Python
cp Lib/{symbol,keyword,token}.py .
python symbol.py
python keyword.py
python token.py
# Now build the main program
make CFLAGS="-U__DATE__ -U__TIME__"
}
src_install() {
default
# Remove non-reproducible .pyc/o files
find "${DESTDIR}" -name "*.pyc" -delete
find "${DESTDIR}" -name "*.pyo" -delete
}

@@ -0,0 +1,31 @@
SPDX-FileCopyrightText: 2022 fosslinux <fosslinux@aussies.space>
SPDX-License-Identifier: PSF-2.0
There is a cycle in the build process: graminit.h requires parsetok.c to
be built, but parsetok.c includes graminit.h.
Luckily, the cycle can be broken by NOP-ing out the one piece of logic
that needs graminit.h.
We apply this patch before regenerating graminit.h and revert it
afterward.
--- Parser/parsetok.c 2022-10-09 20:22:15.431229996 +1100
+++ Parser/parsetok.c 2022-10-09 20:22:57.981822483 +1100
@@ -8,7 +8,6 @@
#include "parser.h"
#include "parsetok.h"
#include "errcode.h"
-#include "graminit.h"
int Py_TabcheckFlag;
@@ -239,7 +238,7 @@
err_ret->text = text;
}
} else if (tok->encoding != NULL) {
- node* r = PyNode_New(encoding_decl);
+ node* r = NULL;
if (!r) {
err_ret->error = E_NOMEM;
n = NULL;

@@ -0,0 +1,28 @@
SPDX-FileCopyrightText: 2022 fosslinux <fosslinux@aussies.space>
SPDX-License-Identifier: PSF-2.0
frozenset() is newer than the Python interpreter we use to bootstrap
Python 2.5, so this logic must be bypassed. (It is not critical, so we
can just remove it.)
--- Lib/keyword.py 2022-10-11 12:51:13.050744758 +1100
+++ Lib/keyword.py 2022-10-11 12:52:05.946372559 +1100
@@ -10,7 +10,7 @@
python Lib/keyword.py
"""
-__all__ = ["iskeyword", "kwlist"]
+__all__ = ["kwlist"]
kwlist = [
#--start keywords--
@@ -48,8 +48,6 @@
#--end keywords--
]
-iskeyword = frozenset(kwlist).__contains__
-
def main():
import sys, re
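
What the removed line provided, and why dropping it is harmless for bootstrap purposes, can be sketched like this (kwlist abbreviated for illustration; on any modern Python both forms agree):

```python
kwlist = ["and", "del", "for", "if", "while"]  # abbreviated sample

# The line the patch removes: an O(1) membership test bound to a frozenset
iskeyword = frozenset(kwlist).__contains__

# A frozenset-free equivalent that even an old bootstrap Python could run
def iskeyword_compat(word):
    return word in kwlist

assert iskeyword("while") and iskeyword_compat("while")
assert not iskeyword("spam") and not iskeyword_compat("spam")
```

Only the `iskeyword` convenience is lost; the `kwlist` data that keyword.py exists to regenerate is untouched.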

@@ -0,0 +1,33 @@
SPDX-FileCopyrightText: 2022 fosslinux <fosslinux@aussies.space>
SPDX-License-Identifier: PSF-2.0
musl (correctly) implements the POSIX posix_close function; however, this
was added after Python 2.5.6 was released, so Python's own posix_close is
renamed to avoid the clash.
--- Modules/posixmodule.c 2022-10-05 18:38:46.718131893 +1100
+++ Modules/posixmodule.c 2022-10-05 18:39:07.049250312 +1100
@@ -5208,12 +5208,12 @@
}
-PyDoc_STRVAR(posix_close__doc__,
+PyDoc_STRVAR(py_posix_close__doc__,
"close(fd)\n\n\
Close a file descriptor (for low level IO).");
static PyObject *
-posix_close(PyObject *self, PyObject *args)
+py_posix_close(PyObject *self, PyObject *args)
{
int fd, res;
if (!PyArg_ParseTuple(args, "i:close", &fd))
@@ -7371,7 +7371,7 @@
{"tcsetpgrp", posix_tcsetpgrp, METH_VARARGS, posix_tcsetpgrp__doc__},
#endif /* HAVE_TCSETPGRP */
{"open", posix_open, METH_VARARGS, posix_open__doc__},
- {"close", posix_close, METH_VARARGS, posix_close__doc__},
+ {"close", py_posix_close, METH_VARARGS, py_posix_close__doc__},
{"dup", posix_dup, METH_VARARGS, posix_dup__doc__},
{"dup2", posix_dup2, METH_VARARGS, posix_dup2__doc__},
{"lseek", posix_lseek, METH_VARARGS, posix_lseek__doc__},

@@ -0,0 +1,29 @@
SPDX-FileCopyrightText: 2022 fosslinux <fosslinux@aussies.space>
SPDX-License-Identifier: PSF-2.0
sorted() is newer than the Python we use to bootstrap 2.5, so it is not
available here. We cannot use .sort() either, as it does not support the
key= parameter in that old version.
Instead we sort the values ourselves with a basic selection sort using a
custom key.
--- Tools/compiler/astgen.py.bak 2022-07-11 09:24:59.600238862 +1000
+++ Tools/compiler/astgen.py 2022-07-11 09:32:25.814974174 +1000
@@ -215,7 +215,15 @@
# some extra code for a Node's __init__ method
name = mo.group(1)
cur = classes[name]
- return sorted(classes.values(), key=lambda n: n.name)
+ ret = classes.values()
+ # basic custom selection sort
+ for i in range(len(ret)):
+ min_i = i
+ for j in range(i + 1, len(ret)):
+ if ret[min_i].name > ret[j].name:
+ min_i = j
+ ret[i], ret[min_i] = ret[min_i], ret[i]
+ return ret
def main():
prologue, epilogue = load_boilerplate(sys.argv[-1])
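
The substituted selection sort matches sorted(..., key=...) whenever the keys are distinct (selection sort is not stable, so equal keys may come out in a different order); a standalone version with illustrative data:

```python
def selection_sort_by(items, key):
    """Selection sort with a key function, mirroring the patched astgen.py."""
    ret = list(items)
    for i in range(len(ret)):
        min_i = i
        for j in range(i + 1, len(ret)):
            if key(ret[min_i]) > key(ret[j]):
                min_i = j
        ret[i], ret[min_i] = ret[min_i], ret[i]
    return ret

names = ["Module", "Assign", "Compare", "Node"]
assert selection_sort_by(names, key=str.lower) == sorted(names, key=str.lower)
```

Instability is fine here: AST node class names are unique, so every key is distinct.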

@@ -0,0 +1,20 @@
SPDX-FileCopyrightText: 2022 fosslinux <fosslinux@aussies.space>
SPDX-License-Identifier: PSF-2.0
Again, the key= argument for sorting functions is newer than the Python
used to bootstrap 2.5, so it is not available here.
Sorting is absolutely unnecessary when generating defines for a header
file, so we can just remove it.
--- Lib/sre_constants.py 2004-08-25 12:22:30.000000000 +1000
+++ Lib/sre_constants.py 2022-10-09 20:18:40.332233858 +1100
@@ -219,7 +219,6 @@
if __name__ == "__main__":
def dump(f, d, prefix):
items = d.items()
- items.sort(key=lambda a: a[1])
for k, v in items:
f.write("#define %s_%s %s\n" % (prefix, k.upper(), v))
f = open("sre_constants.h", "w")

@@ -0,0 +1,84 @@
# SPDX-FileCopyrightText: 2022 fosslinux <fosslinux@aussies.space>
#
# SPDX-License-Identifier: GPL-3.0-or-later
src_prepare() {
default
# Remove broken file
rm Lib/test/test_pep263.py
# Delete generated files
rm Modules/glmodule.c
rm Include/Python-ast.h Python/Python-ast.c
rm Lib/stringprep.py
rm Misc/Vim/python.vim
mv Lib/plat-generic .
rm -r Lib/plat-*
rm -r Modules/_ctypes/libffi
mv plat-generic Lib/
grep generated -r . -l | grep encodings | xargs rm
# Regenerate unicode
rm Modules/unicodedata_db.h Modules/unicodename_db.h Objects/unicodetype_db.h
for f in UnicodeData CompositionExclusions EastAsianWidth; do
mv "../${f}-3.2.0.txt" .
mv "../${f}-4.1.0.txt" "${f}.txt"
done
python Tools/unicode/makeunicodedata.py
# Regenerate sre_constants.h
rm Modules/sre_constants.h
python Lib/sre_constants.py
# Regen ast module
rm Lib/compiler/ast.py
pushd Tools/compiler
python astgen.py > ../../Lib/compiler/ast.py
popd
# Regenerate autoconf
autoreconf-2.71 -fi
}
src_configure() {
MACHDEP=linux ac_sys_system=Linux \
CFLAGS="-U__DATE__ -U__TIME__" \
LDFLAGS="-L/usr/lib/musl" \
./configure \
--build=i386-unknown-linux-musl \
--host=i386-unknown-linux-musl \
--prefix="${PREFIX}" \
--libdir="${PREFIX}/lib/musl" \
--with-system-ffi
}
src_compile() {
# Temporarily break include cycle
patch -Np0 -i graminit-regen.patch
# Build pgen
make Parser/pgen
# Regen graminit.c and graminit.h
make Include/graminit.h
# Regenerate some Python scripts using the other regenerated files
# Must move them out to avoid using Lib/ module files which are
# incompatible with running version of Python
cp Lib/{symbol,keyword,token}.py .
python symbol.py
python keyword.py
python token.py
# Undo change
patch -Np0 -R -i graminit-regen.patch
# Now build the main program
make CFLAGS="-U__DATE__ -U__TIME__"
}
src_install() {
default
# Remove non-reproducible .pyc/o files
find "${DESTDIR}" -name "*.pyc" -delete
find "${DESTDIR}" -name "*.pyo" -delete
}

@@ -0,0 +1,7 @@
https://www.python.org/ftp/python/2.5.6/Python-2.5.6.tar.bz2 57e04484de051decd4741fb4a4a3f543becc9a219af8b8063b5541e270f26dcc
http://ftp.unicode.org/Public/3.2-Update/UnicodeData-3.2.0.txt 5e444028b6e76d96f9dc509609c5e3222bf609056f35e5fcde7e6fb8a58cd446
http://ftp.unicode.org/Public/3.2-Update/CompositionExclusions-3.2.0.txt 1d3a450d0f39902710df4972ac4a60ec31fbcb54ffd4d53cd812fc1200c732cb
http://ftp.unicode.org/Public/3.2-Update/EastAsianWidth-3.2.0.txt ce19f35ffca911bf492aab6c0d3f6af3d1932f35d2064cf2fe14e10be29534cb
http://ftp.unicode.org/Public/4.1.0/ucd/UnicodeData.txt a9f03f6a061ee210c53e33782288a208bed48c65c70d307b2b214989cedfdab0 UnicodeData-4.1.0.txt
http://ftp.unicode.org/Public/4.1.0/ucd/CompositionExclusions.txt 1003a6896078e77532a017b135762501ff0a540ba33694e32b6177f093ebe6b2 CompositionExclusions-4.1.0.txt
http://ftp.unicode.org/Public/4.1.0/ucd/EastAsianWidth.txt 089ed5b2becd3196e61124d36e968474d3b7152cb5a3fb56594c34ab1e698e92 EastAsianWidth-4.1.0.txt

@@ -0,0 +1,31 @@
SPDX-FileCopyrightText: 2022 fosslinux <fosslinux@aussies.space>
SPDX-License-Identifier: PSF-2.0
There is a cycle in the build process: graminit.h requires parsetok.c to
be built, but parsetok.c includes graminit.h.
Luckily, the cycle can be broken by NOP-ing out the one piece of logic
that needs graminit.h.
We apply this patch before regenerating graminit.h and revert it
afterward.
--- Parser/parsetok.c 2022-10-11 14:11:29.522466304 +1100
+++ Parser/parsetok.c 2022-10-11 14:11:42.786627172 +1100
@@ -8,7 +8,6 @@
#include "parser.h"
#include "parsetok.h"
#include "errcode.h"
-#include "graminit.h"
/* Forward */
@@ -240,7 +239,7 @@
}
}
} else if (tok->encoding != NULL) {
- node* r = PyNode_New(encoding_decl);
+ node* r = NULL;
if (!r) {
err_ret->error = E_NOMEM;
n = NULL;

@@ -0,0 +1,483 @@
SPDX-FileCopyrightText: 2022 fosslinux <fosslinux@aussies.space>
SPDX-License-Identifier: PSF-2.0
We are building Python 3 using Python 2 as our bootstrap, but
makeunicodedata.py has been converted to Python 3. We need to
convert it back, particularly the print statements and the
file-writing calls.
We only apply this to the first build.
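
The back-conversion is mechanical: every Python 3 `print(..., file=fp)` in the hunks below becomes an explicit `fp.write(... % ...)` with a trailing newline. Shown here in Python 3 terms so both forms can run side by side (the value is illustrative):

```python
import io

fp_new, fp_old = io.StringIO(), io.StringIO()

shift = 7
print("#define SHIFT", shift, file=fp_new)  # the Python 3 form being removed
fp_old.write("#define SHIFT %d\n" % shift)  # the Python 2-compatible form added

assert fp_new.getvalue() == fp_old.getvalue() == "#define SHIFT 7\n"
```

print() inserts a space between arguments and appends a newline, which is why the replacement format strings carry explicit spaces and `\n`.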
--- Tools/unicode/makeunicodedata.py 2012-04-10 09:25:37.000000000 +1000
+++ Tools/unicode/makeunicodedata.py 2022-07-13 14:13:37.864821008 +1000
@@ -67,7 +67,7 @@
def maketables(trace=0):
- print("--- Reading", UNICODE_DATA % "", "...")
+ print "--- Reading", UNICODE_DATA % "", "..."
version = ""
unicode = UnicodeData(UNICODE_DATA % version,
@@ -76,15 +76,15 @@
DERIVED_CORE_PROPERTIES % version,
DERIVEDNORMALIZATION_PROPS % version)
- print(len(list(filter(None, unicode.table))), "characters")
+ print len(list(filter(None, unicode.table))), "characters"
for version in old_versions:
- print("--- Reading", UNICODE_DATA % ("-"+version), "...")
+ print "--- Reading", UNICODE_DATA % ("-"+version) + "..."
old_unicode = UnicodeData(UNICODE_DATA % ("-"+version),
COMPOSITION_EXCLUSIONS % ("-"+version),
EASTASIAN_WIDTH % ("-"+version),
DERIVED_CORE_PROPERTIES % ("-"+version))
- print(len(list(filter(None, old_unicode.table))), "characters")
+ print len(list(filter(None, old_unicode.table))), "characters"
merge_old_version(version, unicode, old_unicode)
makeunicodename(unicode, trace)
@@ -103,7 +103,7 @@
FILE = "Modules/unicodedata_db.h"
- print("--- Preparing", FILE, "...")
+ print "--- Preparing", FILE, "..."
# 1) database properties
@@ -214,92 +214,90 @@
l = comp_last[l]
comp_data[f*total_last+l] = char
- print(len(table), "unique properties")
- print(len(decomp_prefix), "unique decomposition prefixes")
- print(len(decomp_data), "unique decomposition entries:", end=' ')
- print(decomp_size, "bytes")
- print(total_first, "first characters in NFC")
- print(total_last, "last characters in NFC")
- print(len(comp_pairs), "NFC pairs")
+ print len(table), "unique properties"
+ print len(decomp_prefix), "unique decomposition prefixes"
+ print len(decomp_data), "unique decomposition entries:",
+ print decomp_size, "bytes"
+ print total_first, "first characters in NFC"
+ print total_last, "last characters in NFC"
+ print len(comp_pairs), "NFC pairs"
- print("--- Writing", FILE, "...")
+ print "--- Writing", FILE, "..."
fp = open(FILE, "w")
- print("/* this file was generated by %s %s */" % (SCRIPT, VERSION), file=fp)
- print(file=fp)
- print('#define UNIDATA_VERSION "%s"' % UNIDATA_VERSION, file=fp)
- print("/* a list of unique database records */", file=fp)
- print("const _PyUnicode_DatabaseRecord _PyUnicode_Database_Records[] = {", file=fp)
+ fp.write("/* this file was generated by %s %s */\n\n" % (SCRIPT, VERSION))
+ fp.write('#define UNIDATA_VERSION "%s"\n' % UNIDATA_VERSION)
+ fp.write("/* a list of unique database records */\n")
+ fp.write("const _PyUnicode_DatabaseRecord _PyUnicode_Database_Records[] = {\n")
for item in table:
- print(" {%d, %d, %d, %d, %d, %d}," % item, file=fp)
- print("};", file=fp)
- print(file=fp)
-
- print("/* Reindexing of NFC first characters. */", file=fp)
- print("#define TOTAL_FIRST",total_first, file=fp)
- print("#define TOTAL_LAST",total_last, file=fp)
- print("struct reindex{int start;short count,index;};", file=fp)
- print("static struct reindex nfc_first[] = {", file=fp)
+ fp.write(" {%d, %d, %d, %d, %d, %d},\n" % item)
+ fp.write("};\n\n")
+
+ fp.write("/* Reindexing of NFC first characters. */\n")
+ fp.write("#define TOTAL_FIRST %d \n" % total_first)
+ fp.write("#define TOTAL_LAST %d \n" % total_last)
+ fp.write("struct reindex{int start;short count,index;};\n")
+ fp.write("static struct reindex nfc_first[] = {\n")
for start,end in comp_first_ranges:
- print(" { %d, %d, %d}," % (start,end-start,comp_first[start]), file=fp)
- print(" {0,0,0}", file=fp)
- print("};\n", file=fp)
- print("static struct reindex nfc_last[] = {", file=fp)
+ fp.write(" { %d, %d, %d},\n" % (start,end-start,comp_first[start]))
+ fp.write(" {0,0,0}\n")
+ fp.write("};\n")
+ fp.write("static struct reindex nfc_last[] = {\n")
for start,end in comp_last_ranges:
- print(" { %d, %d, %d}," % (start,end-start,comp_last[start]), file=fp)
- print(" {0,0,0}", file=fp)
- print("};\n", file=fp)
+ fp.write(" { %d, %d, %d},\n" % (start,end-start,comp_last[start]))
+ fp.write(" {0,0,0}\n")
+ fp.write("};\n")
# FIXME: <fl> the following tables could be made static, and
# the support code moved into unicodedatabase.c
- print("/* string literals */", file=fp)
- print("const char *_PyUnicode_CategoryNames[] = {", file=fp)
+ fp.write("/* string literals */")
+ fp.write("const char *_PyUnicode_CategoryNames[] = {")
for name in CATEGORY_NAMES:
- print(" \"%s\"," % name, file=fp)
- print(" NULL", file=fp)
- print("};", file=fp)
+ fp.write(" \"%s\",\n" % name)
+ fp.write(" NULL\n")
+ fp.write("};\n")
- print("const char *_PyUnicode_BidirectionalNames[] = {", file=fp)
+ fp.write("const char *_PyUnicode_BidirectionalNames[] = {\n")
for name in BIDIRECTIONAL_NAMES:
- print(" \"%s\"," % name, file=fp)
- print(" NULL", file=fp)
- print("};", file=fp)
+ fp.write(" \"%s\",\n" % name)
+ fp.write(" NULL\n")
+ fp.write("};\n")
- print("const char *_PyUnicode_EastAsianWidthNames[] = {", file=fp)
+ fp.write("const char *_PyUnicode_EastAsianWidthNames[] = {\n")
for name in EASTASIANWIDTH_NAMES:
- print(" \"%s\"," % name, file=fp)
- print(" NULL", file=fp)
- print("};", file=fp)
+ fp.write(" \"%s\",\n" % name)
+ fp.write(" NULL\n")
+ fp.write("};\n")
- print("static const char *decomp_prefix[] = {", file=fp)
+ fp.write("static const char *decomp_prefix[] = {\n")
for name in decomp_prefix:
- print(" \"%s\"," % name, file=fp)
- print(" NULL", file=fp)
- print("};", file=fp)
+ fp.write(" \"%s\",\n" % name)
+ fp.write(" NULL\n")
+ fp.write("};\n")
# split record index table
index1, index2, shift = splitbins(index, trace)
- print("/* index tables for the database records */", file=fp)
- print("#define SHIFT", shift, file=fp)
+ fp.write("/* index tables for the database records */\n")
+ fp.write("#define SHIFT %d\n" % shift)
Array("index1", index1).dump(fp, trace)
Array("index2", index2).dump(fp, trace)
# split decomposition index table
index1, index2, shift = splitbins(decomp_index, trace)
- print("/* decomposition data */", file=fp)
+ fp.write("/* decomposition data */\n")
Array("decomp_data", decomp_data).dump(fp, trace)
- print("/* index tables for the decomposition data */", file=fp)
- print("#define DECOMP_SHIFT", shift, file=fp)
+ fp.write("/* index tables for the decomposition data */\n")
+ fp.write("#define DECOMP_SHIFT %d\n" % shift)
Array("decomp_index1", index1).dump(fp, trace)
Array("decomp_index2", index2).dump(fp, trace)
index, index2, shift = splitbins(comp_data, trace)
- print("/* NFC pairs */", file=fp)
- print("#define COMP_SHIFT", shift, file=fp)
+ fp.write("/* NFC pairs */\n")
+ fp.write("#define COMP_SHIFT %d\n" % shift)
Array("comp_index", index).dump(fp, trace)
Array("comp_data", index2).dump(fp, trace)
@@ -316,30 +314,30 @@
index[i] = cache[record] = len(records)
records.append(record)
index1, index2, shift = splitbins(index, trace)
- print("static const change_record change_records_%s[] = {" % cversion, file=fp)
+ fp.write("static const change_record change_records_%s[] = {\n" % cversion)
for record in records:
- print("\t{ %s }," % ", ".join(map(str,record)), file=fp)
- print("};", file=fp)
- Array("changes_%s_index" % cversion, index1).dump(fp, trace)
- Array("changes_%s_data" % cversion, index2).dump(fp, trace)
- print("static const change_record* get_change_%s(Py_UCS4 n)" % cversion, file=fp)
- print("{", file=fp)
- print("\tint index;", file=fp)
- print("\tif (n >= 0x110000) index = 0;", file=fp)
- print("\telse {", file=fp)
- print("\t\tindex = changes_%s_index[n>>%d];" % (cversion, shift), file=fp)
- print("\t\tindex = changes_%s_data[(index<<%d)+(n & %d)];" % \
- (cversion, shift, ((1<<shift)-1)), file=fp)
- print("\t}", file=fp)
- print("\treturn change_records_%s+index;" % cversion, file=fp)
- print("}\n", file=fp)
- print("static Py_UCS4 normalization_%s(Py_UCS4 n)" % cversion, file=fp)
- print("{", file=fp)
- print("\tswitch(n) {", file=fp)
+ fp.write("\t{ %s },\n" % ", ".join(map(str,record)))
+ fp.write("};\n")
+ Array("changes_%s_index" % cversion, index1).dump(fp, trace)
+ Array("changes_%s_data" % cversion, index2).dump(fp, trace)
+ fp.write("static const change_record* get_change_%s(Py_UCS4 n)\n" % cversion)
+ fp.write("{\n")
+ fp.write("\tint index;\n")
+ fp.write("\tif (n >= 0x110000) index = 0;\n")
+ fp.write("\telse {\n")
+ fp.write("\t\tindex = changes_%s_index[n>>%d];\n" % (cversion, shift))
+ fp.write("\t\tindex = changes_%s_data[(index<<%d)+(n & %d)];\n" % \
+ (cversion, shift, ((1<<shift)-1)))
+ fp.write("\t}\n")
+ fp.write("\treturn change_records_%s+index;\n" % cversion)
+ fp.write("}\n\n")
+ fp.write("static Py_UCS4 normalization_%s(Py_UCS4 n)\n" % cversion)
+ fp.write("{\n")
+ fp.write("\tswitch(n) {\n")
for k, v in normalization:
- print("\tcase %s: return 0x%s;" % (hex(k), v), file=fp)
- print("\tdefault: return 0;", file=fp)
- print("\t}\n}\n", file=fp)
+ fp.write("\tcase %s: return 0x%s;\n" % (hex(k), v))
+ fp.write("\tdefault: return 0;\n")
+ fp.write("\t}\n}\n\n")
fp.close()
@@ -350,7 +348,7 @@
FILE = "Objects/unicodetype_db.h"
- print("--- Preparing", FILE, "...")
+ print "--- Preparing", FILE, "..."
# extract unicode types
dummy = (0, 0, 0, 0, 0, 0)
@@ -433,25 +431,25 @@
table.append(item)
index[char] = i
- print(len(table), "unique character type entries")
+ print len(table), "unique character type entries"
- print("--- Writing", FILE, "...")
+ print "--- Writing", FILE, "..."
fp = open(FILE, "w")
- print("/* this file was generated by %s %s */" % (SCRIPT, VERSION), file=fp)
- print(file=fp)
- print("/* a list of unique character type descriptors */", file=fp)
- print("const _PyUnicode_TypeRecord _PyUnicode_TypeRecords[] = {", file=fp)
+ fp.write("/* this file was generated by %s %s */\n" % (SCRIPT, VERSION))
+ fp.write("\n")
+ fp.write("/* a list of unique character type descriptors */\n")
+ fp.write("const _PyUnicode_TypeRecord _PyUnicode_TypeRecords[] = {\n")
for item in table:
- print(" {%d, %d, %d, %d, %d, %d}," % item, file=fp)
- print("};", file=fp)
- print(file=fp)
+ fp.write(" {%d, %d, %d, %d, %d, %d},\n" % item)
+ fp.write("};\n")
+ fp.write("\n")
# split decomposition index table
index1, index2, shift = splitbins(index, trace)
- print("/* type indexes */", file=fp)
- print("#define SHIFT", shift, file=fp)
+ fp.write("/* type indexes */\n")
+ fp.write("#define SHIFT %d\n" % shift)
Array("index1", index1).dump(fp, trace)
Array("index2", index2).dump(fp, trace)
@@ -464,7 +462,7 @@
FILE = "Modules/unicodename_db.h"
- print("--- Preparing", FILE, "...")
+ print "--- Preparing", FILE, "..."
# collect names
names = [None] * len(unicode.chars)
@@ -476,7 +474,7 @@
if name and name[0] != "<":
names[char] = name + chr(0)
- print(len(list(n for n in names if n is not None)), "distinct names")
+ print len(list(n for n in names if n is not None)), "distinct names"
# collect unique words from names (note that we differ between
# words inside a sentence, and words ending a sentence. the
@@ -497,7 +495,7 @@
else:
words[w] = [len(words)]
- print(n, "words in text;", b, "bytes")
+ print n, "words in text;", b, "bytes"
wordlist = list(words.items())
@@ -511,19 +509,19 @@
escapes = 0
while escapes * 256 < len(wordlist):
escapes = escapes + 1
- print(escapes, "escapes")
+ print escapes, "escapes"
short = 256 - escapes
assert short > 0
- print(short, "short indexes in lexicon")
+ print short, "short indexes in lexicon"
# statistics
n = 0
for i in range(short):
n = n + len(wordlist[i][1])
- print(n, "short indexes in phrasebook")
+ print n, "short indexes in phrasebook"
# pick the most commonly used words, and sort the rest on falling
# length (to maximize overlap)
@@ -592,29 +590,29 @@
codehash = Hash("code", data, 47)
- print("--- Writing", FILE, "...")
+ print "--- Writing", FILE, "..."
fp = open(FILE, "w")
- print("/* this file was generated by %s %s */" % (SCRIPT, VERSION), file=fp)
- print(file=fp)
- print("#define NAME_MAXLEN", 256, file=fp)
- print(file=fp)
- print("/* lexicon */", file=fp)
+ fp.write("/* this file was generated by %s %s */\n" % (SCRIPT, VERSION))
+ fp.write("\n")
+ fp.write("#define NAME_MAXLEN 256\n")
+ fp.write("\n")
+ fp.write("/* lexicon */\n")
Array("lexicon", lexicon).dump(fp, trace)
Array("lexicon_offset", lexicon_offset).dump(fp, trace)
# split decomposition index table
offset1, offset2, shift = splitbins(phrasebook_offset, trace)
- print("/* code->name phrasebook */", file=fp)
- print("#define phrasebook_shift", shift, file=fp)
- print("#define phrasebook_short", short, file=fp)
+ fp.write("/* code->name phrasebook */\n")
+ fp.write("#define phrasebook_shift %d\n" % shift)
+ fp.write("#define phrasebook_short %d\n" % short)
Array("phrasebook", phrasebook).dump(fp, trace)
Array("phrasebook_offset1", offset1).dump(fp, trace)
Array("phrasebook_offset2", offset2).dump(fp, trace)
- print("/* name->code dictionary */", file=fp)
+ fp.write("/* name->code dictionary */\n")
codehash.dump(fp, trace)
fp.close()
@@ -868,7 +866,7 @@
else:
raise AssertionError("ran out of polynomials")
- print(size, "slots in hash table")
+ print size, "slots in hash table"
table = [None] * size
@@ -900,7 +898,7 @@
if incr > mask:
incr = incr ^ poly
- print(n, "collisions")
+ print n, "collisions"
self.collisions = n
for i in range(len(table)):
@@ -931,8 +929,6 @@
def dump(self, file, trace=0):
# write data to file, as a C array
size = getsize(self.data)
- if trace:
- print(self.name+":", size*len(self.data), "bytes", file=sys.stderr)
file.write("static ")
if size == 1:
file.write("unsigned char")
@@ -980,12 +976,6 @@
"""
import sys
- if trace:
- def dump(t1, t2, shift, bytes):
- print("%d+%d bins at shift %d; %d bytes" % (
- len(t1), len(t2), shift, bytes), file=sys.stderr)
- print("Size of original table:", len(t)*getsize(t), \
- "bytes", file=sys.stderr)
n = len(t)-1 # last valid index
maxshift = 0 # the most we can shift n and still have something left
if n > 0:
@@ -993,7 +983,7 @@
n >>= 1
maxshift += 1
del n
- bytes = sys.maxsize # smallest total size so far
+ bytes_size = 2**31 - 1 # smallest total size so far
t = tuple(t) # so slices can be dict keys
for shift in range(maxshift + 1):
t1 = []
@@ -1010,15 +1000,10 @@
t1.append(index >> shift)
# determine memory size
b = len(t1)*getsize(t1) + len(t2)*getsize(t2)
- if trace > 1:
- dump(t1, t2, shift, b)
- if b < bytes:
+ if b < bytes_size:
best = t1, t2, shift
- bytes = b
+ bytes_size = b
t1, t2, shift = best
- if trace:
- print("Best:", end=' ', file=sys.stderr)
- dump(t1, t2, shift, bytes)
if __debug__:
# exhaustively verify that the decomposition is correct
mask = ~((~0) << shift) # i.e., low-bit mask of shift bits
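The bulk of the patch above swaps Python 3's `print(..., file=fp)` for explicit `fp.write()` calls so makeunicodedata.py can run under the early bootstrap interpreters. The two idioms only match when each converted call spells out the newline that `print` added implicitly; a minimal sketch of the equivalence, using `io.StringIO` as a stand-in for the output file:

```python
import io

# print(..., file=fp) appends "\n" automatically ...
fp = io.StringIO()
print("};", file=fp)
py3_style = fp.getvalue()

# ... so the fp.write() replacement must carry an explicit "\n".
fp = io.StringIO()
fp.write("};\n")
portable_style = fp.getvalue()

assert py3_style == portable_style == "};\n"
```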
--- Lib/token.py 2012-04-10 09:25:36.000000000 +1000
+++ Lib/token.py 2022-07-13 14:13:37.893821468 +1000
@@ -93,11 +93,7 @@
outFileName = "Lib/token.py"
if len(args) > 1:
outFileName = args[1]
- try:
- fp = open(inFileName)
- except IOError as err:
- sys.stdout.write("I/O error: %s\n" % str(err))
- sys.exit(1)
+ fp = open(inFileName)
lines = fp.read().split("\n")
fp.close()
prog = re.compile(
@@ -114,7 +110,7 @@
# load the output skeleton from the target:
try:
fp = open(outFileName)
- except IOError as err:
+ except IOError, err:
sys.stderr.write("I/O error: %s\n" % str(err))
sys.exit(2)
format = fp.read().split("\n")
@@ -131,7 +127,7 @@
format[start:end] = lines
try:
fp = open(outFileName, 'w')
- except IOError as err:
+ except IOError, err:
sys.stderr.write("I/O error: %s\n" % str(err))
sys.exit(4)
fp.write("\n".join(format))

@@ -0,0 +1,33 @@
SPDX-FileCopyrightText: 2022 fosslinux <fosslinux@aussies.space>
SPDX-License-Identifier: PSF-2.0
openssl is too new for this version of Python. Tell Python's build
system that we don't have openssl.
--- setup.py 2022-12-19 10:51:49.749157041 +1100
+++ setup.py 2022-12-19 10:52:37.223748681 +1100
@@ -712,7 +712,7 @@
#print('openssl_ver = 0x%08x' % openssl_ver)
- if ssl_incs is not None and ssl_libs is not None:
+ if False:
if openssl_ver >= 0x00907000:
# The _hashlib module wraps optimized implementations
# of hash functions from the OpenSSL library.
@@ -727,12 +727,12 @@
else:
missing.append('_hashlib')
- if openssl_ver < 0x00908000:
+ if True:
# OpenSSL doesn't do these until 0.9.8 so we'll bring our own hash
exts.append( Extension('_sha256', ['sha256module.c']) )
exts.append( Extension('_sha512', ['sha512module.c']) )
- if openssl_ver < 0x00907000:
+ if True:
# no openssl at all, use our own md5 and sha1
exts.append( Extension('_md5', ['md5module.c']) )
exts.append( Extension('_sha1', ['sha1module.c']) )
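The gates forced above compare OpenSSL's packed hex version number; hard-coding them makes setup.py always fall back to Python's bundled md5/sha modules. As a hedged sketch of how the `0x00907000`-style thresholds decode under OpenSSL's `0xMNNFFPPS` scheme (status nibble ignored):

```python
def decode_openssl_ver(ver):
    """Unpack OpenSSL's 0xMNNFFPPS version encoding into (major, minor, fix)."""
    major = ver >> 28
    minor = (ver >> 20) & 0xFF
    fix = (ver >> 12) & 0xFF
    return major, minor, fix

# The two thresholds patched out above:
assert decode_openssl_ver(0x00907000) == (0, 9, 7)   # OpenSSL 0.9.7
assert decode_openssl_ver(0x00908000) == (0, 9, 8)   # OpenSSL 0.9.8
```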

@@ -0,0 +1,33 @@
SPDX-FileCopyrightText: 2022 fosslinux <fosslinux@aussies.space>
SPDX-License-Identifier: PSF-2.0
musl (correctly) implements the POSIX posix_close function; however,
it was added after Python 3.1.5 was released.
--- Modules/posixmodule.c 2022-10-15 10:20:33.311399832 +1100
+++ Modules/posixmodule.c 2022-10-15 10:21:03.522921510 +1100
@@ -4993,12 +4993,12 @@
}
-PyDoc_STRVAR(posix_close__doc__,
+PyDoc_STRVAR(py_posix_close__doc__,
"close(fd)\n\n\
Close a file descriptor (for low level IO).");
static PyObject *
-posix_close(PyObject *self, PyObject *args)
+py_posix_close(PyObject *self, PyObject *args)
{
int fd, res;
if (!PyArg_ParseTuple(args, "i:close", &fd))
@@ -7198,7 +7198,7 @@
{"tcsetpgrp", posix_tcsetpgrp, METH_VARARGS, posix_tcsetpgrp__doc__},
#endif /* HAVE_TCSETPGRP */
{"open", posix_open, METH_VARARGS, posix_open__doc__},
- {"close", posix_close, METH_VARARGS, posix_close__doc__},
+ {"close", py_posix_close, METH_VARARGS, py_posix_close__doc__},
{"closerange", posix_closerange, METH_VARARGS, posix_closerange__doc__},
{"device_encoding", device_encoding, METH_VARARGS, device_encoding__doc__},
{"dup", posix_dup, METH_VARARGS, posix_dup__doc__},
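The patch renames only the C-level symbol; the Python-visible name comes from the method table entry `{"close", py_posix_close, ...}`, so `os.close` is unaffected. A hedged Python analogy of that decoupling between the implementing function's name and the exported name:

```python
# The module's public name is whatever key the method table uses,
# independent of the implementing function's own name (here renamed,
# as the patch renames posix_close -> py_posix_close in C).
def py_posix_close(fd):
    return ("closed", fd)   # toy stand-in, not the real syscall

method_table = {"close": py_posix_close}

assert method_table["close"](3) == ("closed", 3)
```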

sysc/python-3.1.5/sources Normal file

@@ -0,0 +1,12 @@
https://www.python.org/ftp/python/3.1.5/Python-3.1.5.tar.bz2 3a72a21528f0751e89151744350dd12004131d312d47b935ce8041b070c90361
http://ftp.unicode.org/Public/3.2-Update/UnicodeData-3.2.0.txt 5e444028b6e76d96f9dc509609c5e3222bf609056f35e5fcde7e6fb8a58cd446
http://ftp.unicode.org/Public/3.2-Update/CompositionExclusions-3.2.0.txt 1d3a450d0f39902710df4972ac4a60ec31fbcb54ffd4d53cd812fc1200c732cb
http://ftp.unicode.org/Public/3.2-Update/EastAsianWidth-3.2.0.txt ce19f35ffca911bf492aab6c0d3f6af3d1932f35d2064cf2fe14e10be29534cb
http://ftp.unicode.org/Public/3.2-Update/DerivedCoreProperties-3.2.0.txt 787419dde91701018d7ad4f47432eaa55af14e3fe3fe140a11e4bbf3db18bb4c
http://ftp.unicode.org/Public/3.2-Update/DerivedNormalizationProps-3.2.0.txt bab49295e5f9064213762447224ccd83cea0cced0db5dcfc96f9c8a935ef67ee
http://ftp.unicode.org/Public/5.1.0/ucd/UnicodeData.txt 8bd83e9c4e339728ecd532c5b174de5beb9cb4bab5db14e44fcd03ccb2e2c1b5 UnicodeData-5.1.0.txt
http://ftp.unicode.org/Public/5.1.0/ucd/CompositionExclusions.txt 683b094f2bdd0ab132c0bac293a5404626dd858a53b5364b3b6b525323c5a5e4 CompositionExclusions-5.1.0.txt
http://ftp.unicode.org/Public/5.1.0/ucd/EastAsianWidth.txt a0d8abf08d08f3e61875aed6011cb70c61dd8ea61089e6ad9b6cf524d8fba0f2 EastAsianWidth-5.1.0.txt
http://ftp.unicode.org/Public/5.1.0/ucd/DerivedCoreProperties.txt 8f54c77587fee99facc2f28b94e748dfdda5da44f42adab31a65f88b63587ae0 DerivedCoreProperties-5.1.0.txt
http://ftp.unicode.org/Public/5.1.0/ucd/DerivedNormalizationProps.txt 4fc8cbfa1eed578cdda0768fb4a4ace5443f807c1f652e36a6bd768e81c2c2a3 DerivedNormalizationProps-5.1.0.txt
http://ftp.unicode.org/Public/MAPPINGS/VENDORS/MICSFT/PC/CP437.TXT 6bad4dabcdf5940227c7d81fab130dcb18a77850b5d79de28b5dc4e047b0aaac

sysc/python-3.1.5/stage1.sh Executable file

@@ -0,0 +1,83 @@
# SPDX-FileCopyrightText: 2022 fosslinux <fosslinux@aussies.space>
#
# SPDX-License-Identifier: GPL-3.0-or-later
src_prepare() {
default
patch -Np0 -i py2.patch
# Delete generated files
rm Include/Python-ast.h Python/Python-ast.c
rm Lib/stringprep.py
rm Lib/pydoc_data/topics.py
rm Misc/Vim/python.vim
rm -r Modules/_ctypes/libffi
mv Lib/plat-generic .
rm -r Lib/plat-*
mv plat-generic Lib/
grep generated -r . -l | grep encodings | xargs rm
# Regenerate unicode
rm Modules/unicodedata_db.h Modules/unicodename_db.h Objects/unicodetype_db.h
for f in UnicodeData CompositionExclusions EastAsianWidth DerivedCoreProperties DerivedNormalizationProps; do
mv "../${f}-3.2.0.txt" .
mv "../${f}-5.1.0.txt" "${f}.txt"
done
python Tools/unicode/makeunicodedata.py
# Regenerate sre_constants.h
rm Modules/sre_constants.h
python Lib/sre_constants.py
# Regenerate autoconf
autoreconf-2.71 -fi
}
src_configure() {
MACHDEP=linux ac_sys_system=Linux \
CFLAGS="-U__DATE__ -U__TIME__" \
LDFLAGS="-L/usr/lib/musl" \
./configure \
--prefix="${PREFIX}" \
--libdir="${PREFIX}/lib/musl" \
--build=i386-unknown-linux-musl \
--host=i386-unknown-linux-musl \
--with-pydebug \
--with-system-ffi
}
src_compile() {
# Temporarily break include cycle
patch -Np0 -i graminit-regen.patch
# Build pgen
make Parser/pgen
# Regen graminit.c and graminit.h
make Include/graminit.h
# Regenerate some Python scripts using the other regenerated files
# Must move them out to avoid using Lib/ module files which are
# incompatible with running version of Python
cp Lib/{symbol,keyword,token}.py .
python symbol.py
python keyword.py
python token.py
# Undo change
patch -Np0 -R -i graminit-regen.patch
# Now build the main program
make CFLAGS="-U__DATE__ -U__TIME__"
}
src_install() {
default
ln -s "${PREFIX}/lib/musl/python3.1/lib-dynload" "${DESTDIR}${PREFIX}/lib/python3.1/lib-dynload"
ln -s "${PREFIX}/bin/python3.1" "${DESTDIR}${PREFIX}/bin/python"
# Remove non-reproducible .pyc/o files
find "${DESTDIR}" -name "*.pyc" -delete
find "${DESTDIR}" -name "*.pyo" -delete
# This file is not reproducible and I don't care to fix it
rm "${DESTDIR}/${PREFIX}/lib/python3.1/lib2to3/"{Pattern,}"Grammar3.1.5.final.0.pickle"
}

sysc/python-3.1.5/stage2.sh Executable file

@@ -0,0 +1,88 @@
# SPDX-FileCopyrightText: 2022 fosslinux <fosslinux@aussies.space>
#
# SPDX-License-Identifier: GPL-3.0-or-later
src_prepare() {
default
# Delete generated files
rm Include/Python-ast.h Python/Python-ast.c
rm Lib/stringprep.py
rm Lib/pydoc_data/topics.py
rm Misc/Vim/python.vim
rm -r Modules/_ctypes/libffi
mv Lib/plat-generic .
rm -r Lib/plat-*
mv plat-generic Lib/
grep generated -r . -l | grep encodings | xargs rm
# Regenerate encodings
mkdir Tools/unicode/in Tools/unicode/out
mv ../CP437.TXT Tools/unicode/in/
pushd Tools/unicode
python gencodec.py in/ ../../Lib/encodings/
popd
# Regenerate unicode
rm Modules/unicodedata_db.h Modules/unicodename_db.h Objects/unicodetype_db.h
for f in UnicodeData CompositionExclusions EastAsianWidth DerivedCoreProperties DerivedNormalizationProps; do
mv "../${f}-3.2.0.txt" .
mv "../${f}-5.1.0.txt" "${f}.txt"
done
python Tools/unicode/makeunicodedata.py
# Regenerate sre_constants.h
rm Modules/sre_constants.h
python2.5 Lib/sre_constants.py
# Regenerate autoconf
autoreconf-2.71 -fi
}
src_configure() {
MACHDEP=linux ac_sys_system=Linux \
CFLAGS="-U__DATE__ -U__TIME__" \
LDFLAGS="-L/usr/lib/musl" \
./configure \
--prefix="${PREFIX}" \
--libdir="${PREFIX}/lib/musl" \
--build=i386-unknown-linux-musl \
--host=i386-unknown-linux-musl \
--with-pydebug \
--with-system-ffi
}
src_compile() {
# Temporarily break include cycle
patch -Np0 -i graminit-regen.patch
# Build pgen
make Parser/pgen
# Regen graminit.c and graminit.h
make Include/graminit.h
# Regenerate some Python scripts using the other regenerated files
# Must move them out to avoid using Lib/ module files which are
# incompatible with running version of Python
cp Lib/{symbol,keyword,token}.py .
python symbol.py
python keyword.py
python token.py
# Undo change
patch -Np0 -R -i graminit-regen.patch
# Now build the main program
make CFLAGS="-U__DATE__ -U__TIME__"
}
src_install() {
default
ln -s "${PREFIX}/lib/musl/python3.1/lib-dynload" "${DESTDIR}${PREFIX}/lib/python3.1/lib-dynload"
ln -s "${PREFIX}/bin/python3.1" "${DESTDIR}${PREFIX}/bin/python"
# Remove non-reproducible .pyc/o files
find "${DESTDIR}" -name "*.pyc" -delete
find "${DESTDIR}" -name "*.pyo" -delete
# This file is not reproducible and I don't care to fix it
rm "${DESTDIR}/${PREFIX}/lib/python3.1/lib2to3/"{Pattern,}"Grammar3.1.5.final.0.pickle"
}

@@ -0,0 +1,20 @@
SPDX-FileCopyrightText: 2022 fosslinux <fosslinux@aussies.space>
SPDX-License-Identifier: PSF-2.0
When defined, __DATE__ is in the format MMM DD YYYY. "xx/xx/xx" does
not comply with this, so the parser in Lib/platform.py fails on it.
Upstream PR: https://github.com/python/cpython/pull/100389
--- Modules/getbuildinfo.c 2022-12-21 13:41:18.120573458 +1100
+++ Modules/getbuildinfo.c 2022-12-21 13:41:30.399716652 +1100
@@ -8,7 +8,7 @@
#ifdef __DATE__
#define DATE __DATE__
#else
-#define DATE "xx/xx/xx"
+#define DATE "xxx xx xxxx"
#endif
#endif
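Lib/platform.py splits the build date on whitespace and expects the three MMM DD YYYY fields; a simplified shape check (not platform.py's actual parser) of why the old placeholder broke and the new one does not:

```python
def date_fields(build_date):
    # __DATE__ expands like "Jan 18 2023": three space-separated fields.
    return build_date.split()

assert len(date_fields("Jan 18 2023")) == 3
assert len(date_fields("xxx xx xxxx")) == 3  # patched fallback: same shape
assert len(date_fields("xx/xx/xx")) == 1     # old fallback: wrong shape
```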

@@ -0,0 +1,19 @@
SPDX-FileCopyrightText: 2022 fosslinux <fosslinux@aussies.space>
SPDX-License-Identifier: PSF-2.0
On x86_64 systems our GCC's multiarch support is slightly broken from
the perspective of Python's build system. This doesn't really matter
for our purposes, so just pretend that there is no multiarch.
--- configure.ac.bak 2022-12-21 19:35:44.560977616 +1100
+++ configure.ac 2022-12-21 19:36:00.735143246 +1100
@@ -1096,7 +1096,7 @@
AS_CASE([$ac_sys_system],
[Darwin*], [MULTIARCH=""],
[FreeBSD*], [MULTIARCH=""],
- [MULTIARCH=$($CC --print-multiarch 2>/dev/null)]
+ [MULTIARCH=""]
)
AC_SUBST([MULTIARCH])
AC_MSG_RESULT([$MULTIARCH])

@@ -0,0 +1,97 @@
# SPDX-FileCopyrightText: 2022 fosslinux <fosslinux@aussies.space>
#
# SPDX-License-Identifier: GPL-3.0-or-later
src_prepare() {
default
# Delete generated files that won't be regenerated
rm Lib/pydoc_data/topics.py \
Misc/stable_abi.toml
# Regenerate ssl_data for ssl module
rm Modules/_ssl_data_300.h Modules/_ssl_data.h
python Tools/ssl/make_ssl_data.py ../openssl-1.1.1l Modules/_ssl_data_111.h
# Regenerate encodings
grep generated -r . -l | grep encodings | xargs rm
mkdir Tools/unicode/in Tools/unicode/out
mv ../CP437.TXT Tools/unicode/in/
pushd Tools/unicode
python gencodec.py in/ ../../Lib/encodings/
popd
# Regenerate stringprep
rm Lib/stringprep.py
mv ../rfc3454.txt .
python Tools/unicode/mkstringprep.py > Lib/stringprep.py
# Regenerate unicode
rm Modules/unicodedata_db.h Modules/unicodename_db.h Objects/unicodetype_db.h
mkdir -p Tools/unicode/data
mv ../*.txt ../*.zip Tools/unicode/data/
python Tools/unicode/makeunicodedata.py
# Regenerate Lib/re/_casefix.py
rm Lib/re/_casefix.py
python Tools/scripts/generate_re_casefix.py Lib/re/_casefix.py
# Regenerate Programs/test_frozenmain.h
rm Programs/test_frozenmain.h
python Programs/freeze_test_frozenmain.py Programs/test_frozenmain.h
# Create dummy Python/stdlib_module_names.h
echo 'static const char* _Py_stdlib_module_names[] = {};' > Python/stdlib_module_names.h
# Regenerate autoconf
autoreconf-2.71 -fi
}
src_configure() {
MACHDEP=linux ac_sys_system=Linux \
CPPFLAGS="-U__DATE__ -U__TIME__" \
LDFLAGS="-L/usr/lib/musl" \
./configure \
--build=i386-unknown-linux-musl \
--host=i386-unknown-linux-musl \
--prefix="${PREFIX}" \
--libdir="${PREFIX}/lib/musl" \
--with-system-ffi
}
src_compile() {
# Regenerations
# We have to choose the order ourselves because the Makefile is extremely lax about the order
# First of all, do everything that doesn't use any C
rm Modules/_blake2/blake2s_impl.c
make regen-opcode \
regen-opcode-targets \
regen-typeslots \
regen-token \
regen-ast \
regen-keyword \
regen-sre \
clinic \
regen-pegen-metaparser \
regen-pegen \
regen-global-objects
# Do the freeze regen process
make regen-frozen
make regen-deepfreeze
make regen-global-objects
make CPPFLAGS="-U__DATE__ -U__TIME__"
# Regen Python/stdlib_module_names.h (you must have an existing build first)
make regen-stdlib-module-names
# Now rebuild with proper stdlib_module_names.h
make CPPFLAGS="-U__DATE__ -U__TIME__"
}
src_install() {
default
ln -s "${PREFIX}/lib/musl/python3.11/lib-dynload" "${DESTDIR}${PREFIX}/lib/python3.11/lib-dynload"
ln -s "${PREFIX}/bin/python3.11" "${DESTDIR}${PREFIX}/bin/python"
}

@@ -0,0 +1,24 @@
https://www.python.org/ftp/python/3.11.1/Python-3.11.1.tar.xz 85879192f2cffd56cb16c092905949ebf3e5e394b7f764723529637901dfb58f
http://ftp.unicode.org/Public/3.2-Update/UnicodeData-3.2.0.txt 5e444028b6e76d96f9dc509609c5e3222bf609056f35e5fcde7e6fb8a58cd446
http://ftp.unicode.org/Public/3.2-Update/CompositionExclusions-3.2.0.txt 1d3a450d0f39902710df4972ac4a60ec31fbcb54ffd4d53cd812fc1200c732cb
http://ftp.unicode.org/Public/3.2-Update/EastAsianWidth-3.2.0.txt ce19f35ffca911bf492aab6c0d3f6af3d1932f35d2064cf2fe14e10be29534cb
http://ftp.unicode.org/Public/3.2-Update/DerivedCoreProperties-3.2.0.txt 787419dde91701018d7ad4f47432eaa55af14e3fe3fe140a11e4bbf3db18bb4c
http://ftp.unicode.org/Public/3.2-Update/DerivedNormalizationProps-3.2.0.txt bab49295e5f9064213762447224ccd83cea0cced0db5dcfc96f9c8a935ef67ee
http://ftp.unicode.org/Public/3.2-Update/LineBreak-3.2.0.txt d693ef2a603d07e20b769ef8ba29afca39765588a03e3196294e5be8638ca735
http://ftp.unicode.org/Public/3.2-Update/SpecialCasing-3.2.0.txt 1f7913b74dddff55ee566f6220aa9e465bae6f27709fc21d353b04adb8572b37
http://ftp.unicode.org/Public/3.2-Update/CaseFolding-3.2.0.txt 370f3d1e79a52791c42065946711f4eddb6d9820726afd0e436a3c50360475a9
http://ftp.unicode.org/Public/3.2-Update/Unihan-3.2.0.zip 0582b888c4ebab6e3ce8d340c74788f1a68ca662713a1065b9a007f24bb4fe46
http://ftp.unicode.org/Public/14.0.0/ucd/UnicodeData.txt 36018e68657fdcb3485f636630ffe8c8532e01c977703d2803f5b89d6c5feafb UnicodeData-14.0.0.txt
http://ftp.unicode.org/Public/14.0.0/ucd/CompositionExclusions.txt 3360762fc3295cea54ab251c31df621d05ba4b94d46c60eaac29aa16d70ad1e0 CompositionExclusions-14.0.0.txt
http://ftp.unicode.org/Public/14.0.0/ucd/EastAsianWidth.txt f901ac011aa32a09224d6555da71e2532c59c1d3381322829de0e3b880507250 EastAsianWidth-14.0.0.txt
http://ftp.unicode.org/Public/14.0.0/ucd/DerivedCoreProperties.txt e3eddd7d469cd1b0feed7528defad1a1cc7c6a9ceb0ae4446a6d10921ed2e7bc DerivedCoreProperties-14.0.0.txt
http://ftp.unicode.org/Public/14.0.0/ucd/DerivedNormalizationProps.txt b2c444c20730b097787fdf50bd7d6dd3fc5256ab8084f5b35b11c8776eca674c DerivedNormalizationProps-14.0.0.txt
http://ftp.unicode.org/Public/14.0.0/ucd/LineBreak.txt 9e06e9f35c6959fb91dcc7993f90d58523c3079bc62c6b25f828b4cdebc5d70c LineBreak-14.0.0.txt
http://ftp.unicode.org/Public/14.0.0/ucd/NameAliases.txt 14b3b677d33f95c51423dce6eef4a6a28b4b160451ecedee4b91edb6745cf4a3 NameAliases-14.0.0.txt
http://ftp.unicode.org/Public/14.0.0/ucd/NamedSequences.txt db5745688affcdc0c3927a1ee0667018a96a7b24513f866d5235e98fef6c2436 NamedSequences-14.0.0.txt
http://ftp.unicode.org/Public/14.0.0/ucd/SpecialCasing.txt c667b45908fd269af25fd55d2fc5bbc157fb1b77675936e25c513ce32e080334 SpecialCasing-14.0.0.txt
http://ftp.unicode.org/Public/14.0.0/ucd/CaseFolding.txt a566cd48687b2cd897e02501118b2413c14ae86d318f9abbbba97feb84189f0f CaseFolding-14.0.0.txt
http://ftp.unicode.org/Public/14.0.0/ucd/Unihan.zip 2ae4519b2b82cd4d15379c17e57bfb12c33c0f54da4977de03b2b04bcf11852d Unihan-14.0.0.zip
http://ftp.unicode.org/Public/MAPPINGS/VENDORS/MICSFT/PC/CP437.TXT 6bad4dabcdf5940227c7d81fab130dcb18a77850b5d79de28b5dc4e047b0aaac
https://www.ietf.org/rfc/rfc3454.txt eb722fa698fb7e8823b835d9fd263e4cdb8f1c7b0d234edf7f0e3bd2ccbb2c79
http://artfiles.org/openssl.org/source/old/1.1.1/openssl-1.1.1l.tar.gz 0b7a3e5e59c34827fe0c3a74b7ec8baef302b98fa80088d7f9153aa16fa76bd1

@@ -0,0 +1,22 @@
SPDX-FileCopyrightText: 2022 fosslinux <fosslinux@aussies.space>
SPDX-License-Identifier: PSF-2.0
token is in the standard library, which takes precedence over files
in the path. Rename this file so we can actually import it.
--- Lib/symbol.py 2022-12-19 21:52:07.101953334 +1100
+++ Lib/symbol.py 2022-12-19 21:52:14.752082879 +1100
@@ -102,10 +102,10 @@
def main():
import sys
- import token
+ import _token
if len(sys.argv) == 1:
sys.argv = sys.argv + ["Include/graminit.h", "Lib/symbol.py"]
- token._main()
+ _token._main()
if __name__ == "__main__":
main()

@@ -0,0 +1,86 @@
# SPDX-FileCopyrightText: 2022 fosslinux <fosslinux@aussies.space>
#
# SPDX-License-Identifier: GPL-3.0-or-later
src_prepare() {
default
# Delete generated files
rm Include/Python-ast.h Python/Python-ast.c
rm Lib/stringprep.py
rm Lib/pydoc_data/topics.py
rm Misc/Vim/python.vim
rm -r Modules/_ctypes/libffi
rm Python/importlib.h
mv Lib/plat-generic .
rm -r Lib/plat-*
mv plat-generic Lib/
grep generated -r . -l | grep encodings | xargs rm
# Regenerate encodings
mkdir Tools/unicode/in Tools/unicode/out
mv ../CP437.TXT Tools/unicode/in/
pushd Tools/unicode
python gencodec.py in/ ../../Lib/encodings/
popd
# Regenerate unicode
rm Modules/unicodedata_db.h Modules/unicodename_db.h Objects/unicodetype_db.h
mv ../*.txt ../*.zip .
python Tools/unicode/makeunicodedata.py
# Regenerate sre_constants.h
rm Modules/sre_constants.h
cp Lib/sre_constants.py .
python sre_constants.py
# Regenerate _ssl_data.h
python Tools/ssl/make_ssl_data.py /usr/include/openssl Modules/_ssl_data.h
# Regenerate autoconf
autoreconf-2.71 -fi
}
src_configure() {
MACHDEP=linux ac_sys_system=Linux \
CFLAGS="-U__DATE__ -U__TIME__" \
LDFLAGS="-L/usr/lib/musl" \
./configure \
--build=i386-unknown-linux-musl \
--host=i386-unknown-linux-musl \
--prefix="${PREFIX}" \
--libdir="${PREFIX}/lib/musl" \
--with-system-ffi
}
src_compile() {
# Build pgen
make Parser/pgen
# Regen graminit.c and graminit.h
make Include/graminit.h
# Regenerate some Python scripts using the other regenerated files
# Must move them out to avoid using Lib/ module files which are
# incompatible with running version of Python
cp Lib/{symbol,keyword,token}.py .
cp token.py _token.py
python symbol.py
python keyword.py
python token.py
# Now build the main program
make CFLAGS="-U__DATE__ -U__TIME__"
}
src_install() {
default
ln -s "${PREFIX}/lib/musl/python3.3/lib-dynload" "${DESTDIR}${PREFIX}/lib/python3.3/lib-dynload"
ln -s "${PREFIX}/bin/python3.3" "${DESTDIR}${PREFIX}/bin/python"
# Remove non-reproducible .pyc/o files
find "${DESTDIR}" -name "*.pyc" -delete
find "${DESTDIR}" -name "*.pyo" -delete
# This file is not reproducible and I don't care to fix it
rm "${DESTDIR}/${PREFIX}/lib/python3.3/lib2to3/"{Pattern,}"Grammar3.3.7.final.0.pickle"
}

sysc/python-3.3.7/sources Normal file

@@ -0,0 +1,22 @@
https://www.python.org/ftp/python/3.3.7/Python-3.3.7.tar.xz 85f60c327501c36bc18c33370c14d472801e6af2f901dafbba056f61685429fe
http://ftp.unicode.org/Public/3.2-Update/UnicodeData-3.2.0.txt 5e444028b6e76d96f9dc509609c5e3222bf609056f35e5fcde7e6fb8a58cd446
http://ftp.unicode.org/Public/3.2-Update/CompositionExclusions-3.2.0.txt 1d3a450d0f39902710df4972ac4a60ec31fbcb54ffd4d53cd812fc1200c732cb
http://ftp.unicode.org/Public/3.2-Update/EastAsianWidth-3.2.0.txt ce19f35ffca911bf492aab6c0d3f6af3d1932f35d2064cf2fe14e10be29534cb
http://ftp.unicode.org/Public/3.2-Update/DerivedCoreProperties-3.2.0.txt 787419dde91701018d7ad4f47432eaa55af14e3fe3fe140a11e4bbf3db18bb4c
http://ftp.unicode.org/Public/3.2-Update/DerivedNormalizationProps-3.2.0.txt bab49295e5f9064213762447224ccd83cea0cced0db5dcfc96f9c8a935ef67ee
http://ftp.unicode.org/Public/3.2-Update/LineBreak-3.2.0.txt d693ef2a603d07e20b769ef8ba29afca39765588a03e3196294e5be8638ca735
http://ftp.unicode.org/Public/3.2-Update/SpecialCasing-3.2.0.txt 1f7913b74dddff55ee566f6220aa9e465bae6f27709fc21d353b04adb8572b37
http://ftp.unicode.org/Public/3.2-Update/CaseFolding-3.2.0.txt 370f3d1e79a52791c42065946711f4eddb6d9820726afd0e436a3c50360475a9
http://ftp.unicode.org/Public/3.2-Update/Unihan-3.2.0.zip 0582b888c4ebab6e3ce8d340c74788f1a68ca662713a1065b9a007f24bb4fe46
http://ftp.unicode.org/Public/6.1.0/ucd/UnicodeData.txt 3066262585a3c4f407b16db787e6d3a6e033b90f27405b6c76d1babefffca6ad UnicodeData-6.1.0.txt
http://ftp.unicode.org/Public/6.1.0/ucd/CompositionExclusions.txt 21124f9d38372d68e09c67bcb64694fd4bca0c9cb39c576b1f095554c4ea9693 CompositionExclusions-6.1.0.txt
http://ftp.unicode.org/Public/6.1.0/ucd/EastAsianWidth.txt d591c24b702c1b025b58ca6168746f713b657c6e252c268f52cb07758f428067 EastAsianWidth-6.1.0.txt
http://ftp.unicode.org/Public/6.1.0/ucd/DerivedCoreProperties.txt a03e62ba5fa9c6f327b6e6cfc5d014f59af9b262b768dd9a6aaa39d205dd8b7a DerivedCoreProperties-6.1.0.txt
http://ftp.unicode.org/Public/6.1.0/ucd/DerivedNormalizationProps.txt d028f7eccab4998f8d7a6b15703b088e26ff6ee1f2dbc0939ae872c213de8620 DerivedNormalizationProps-6.1.0.txt
http://ftp.unicode.org/Public/6.1.0/ucd/LineBreak.txt 7b7e2cf582ef7f24fd2747a4ef1a50934c15a0fc0ab10ce737d5e3e47bebde0d LineBreak-6.1.0.txt
http://ftp.unicode.org/Public/6.1.0/ucd/NameAliases.txt 7253bd84e20d34491b2b124a85ca84bd2cd5d113e4957aebae92f0e3c21f0a45 NameAliases-6.1.0.txt
http://ftp.unicode.org/Public/6.1.0/ucd/NamedSequences.txt 60c88b6e3ceec871cc6b7e2d552453f88eef0f40ff2188d9cec7021c2debd36a NamedSequences-6.1.0.txt
http://ftp.unicode.org/Public/6.1.0/ucd/SpecialCasing.txt 7d047fe1aa8a68cc12101427cf03bfbce83201ee277e907822901735f0bfee3c SpecialCasing-6.1.0.txt
http://ftp.unicode.org/Public/6.1.0/ucd/CaseFolding.txt 4c0bece13821a24f469bb8d16ea33fc7da6436b7ebe64c78635673dbfaa88edc CaseFolding-6.1.0.txt
http://ftp.unicode.org/Public/6.1.0/ucd/Unihan.zip 8ca508ef1bc7eba8c102710016d8510f871f69bdcc74ff877c33d01bb799a38f Unihan-6.1.0.zip
http://ftp.unicode.org/Public/MAPPINGS/VENDORS/MICSFT/PC/CP437.TXT 6bad4dabcdf5940227c7d81fab130dcb18a77850b5d79de28b5dc4e047b0aaac

@@ -0,0 +1,22 @@
SPDX-FileCopyrightText: 2022 fosslinux <fosslinux@aussies.space>
SPDX-License-Identifier: PSF-2.0
token is in the standard library, which takes precedence over files
in the path. Rename this file so we can actually import it.
--- Lib/symbol.py 2022-12-19 21:52:07.101953334 +1100
+++ Lib/symbol.py 2022-12-19 21:52:14.752082879 +1100
@@ -102,10 +102,10 @@
def main():
import sys
- import token
+ import _token
if len(sys.argv) == 1:
sys.argv = sys.argv + ["Include/graminit.h", "Lib/symbol.py"]
- token._main()
+ _token._main()
if __name__ == "__main__":
main()


@@ -0,0 +1,89 @@
# SPDX-FileCopyrightText: 2022 fosslinux <fosslinux@aussies.space>
#
# SPDX-License-Identifier: GPL-3.0-or-later
src_prepare() {
default
# Delete generated files
rm Include/Python-ast.h Python/Python-ast.c
rm Lib/stringprep.py
rm Lib/pydoc_data/topics.py
rm -r Modules/_ctypes/libffi
rm Python/importlib.h
rm Modules/_ssl_data.h # Breaks _ssl module, but it fails anyway
mv Lib/plat-generic .
rm -r Lib/plat-*
mv plat-generic Lib/
grep generated -r . -l | grep encodings | xargs rm
# Regenerate encodings
mkdir Tools/unicode/in Tools/unicode/out
mv ../CP437.TXT Tools/unicode/in/
pushd Tools/unicode
python gencodec.py in/ ../../Lib/encodings/
popd
# Regenerate clinic
find . -name "*.c" -or -name "*.h" | \
xargs grep 'clinic input' -l | \
xargs -L 1 python Tools/clinic/clinic.py
# Regenerate unicode
rm Modules/unicodedata_db.h Modules/unicodename_db.h Objects/unicodetype_db.h
mv ../*.txt ../*.zip .
python Tools/unicode/makeunicodedata.py
# Regenerate sre_constants.h
rm Modules/sre_constants.h
cp Lib/sre_constants.py .
python sre_constants.py
mv sre_constants.h Modules/
# Regenerate autoconf
autoreconf-2.71 -fi
}
src_configure() {
MACHDEP=linux ac_sys_system=Linux \
CFLAGS="-U__DATE__ -U__TIME__" \
LDFLAGS="-L/usr/lib/musl" \
./configure \
--build=i386-unknown-linux-musl \
--host=i386-unknown-linux-musl \
--prefix="${PREFIX}" \
--libdir="${PREFIX}/lib/musl" \
--with-system-ffi
}
src_compile() {
# Build pgen
make Parser/pgen
# Regen graminit.c and graminit.h
make Include/graminit.h
# Regenerate some Python scripts using the other regenerated files.
# They must be moved out first to avoid picking up Lib/ module files,
# which are incompatible with the running version of Python.
cp Lib/{symbol,keyword,token}.py .
cp token.py _token.py
python symbol.py
python keyword.py
python token.py
# Now build the main program
make CFLAGS="-U__DATE__ -U__TIME__"
}
src_install() {
default
ln -s "${PREFIX}/lib/musl/python3.4/lib-dynload" "${DESTDIR}${PREFIX}/lib/python3.4/lib-dynload"
ln -s "${PREFIX}/bin/python3.4" "${DESTDIR}${PREFIX}/bin/python"
# Remove non-reproducible .pyc/o files
find "${DESTDIR}" -name "*.pyc" -delete
find "${DESTDIR}" -name "*.pyo" -delete
# This file is not reproducible and I don't care to fix it
rm "${DESTDIR}/${PREFIX}/lib/python3.4/lib2to3/"{Pattern,}"Grammar3.4.10.final.0.pickle"
}


@@ -0,0 +1,22 @@
https://www.python.org/ftp/python/3.4.10/Python-3.4.10.tar.xz d46a8f6fe91679e199c671b1b0a30aaf172d2acb5bcab25beb35f16c3d195b4e
http://ftp.unicode.org/Public/3.2-Update/UnicodeData-3.2.0.txt 5e444028b6e76d96f9dc509609c5e3222bf609056f35e5fcde7e6fb8a58cd446
http://ftp.unicode.org/Public/3.2-Update/CompositionExclusions-3.2.0.txt 1d3a450d0f39902710df4972ac4a60ec31fbcb54ffd4d53cd812fc1200c732cb
http://ftp.unicode.org/Public/3.2-Update/EastAsianWidth-3.2.0.txt ce19f35ffca911bf492aab6c0d3f6af3d1932f35d2064cf2fe14e10be29534cb
http://ftp.unicode.org/Public/3.2-Update/DerivedCoreProperties-3.2.0.txt 787419dde91701018d7ad4f47432eaa55af14e3fe3fe140a11e4bbf3db18bb4c
http://ftp.unicode.org/Public/3.2-Update/DerivedNormalizationProps-3.2.0.txt bab49295e5f9064213762447224ccd83cea0cced0db5dcfc96f9c8a935ef67ee
http://ftp.unicode.org/Public/3.2-Update/LineBreak-3.2.0.txt d693ef2a603d07e20b769ef8ba29afca39765588a03e3196294e5be8638ca735
http://ftp.unicode.org/Public/3.2-Update/SpecialCasing-3.2.0.txt 1f7913b74dddff55ee566f6220aa9e465bae6f27709fc21d353b04adb8572b37
http://ftp.unicode.org/Public/3.2-Update/CaseFolding-3.2.0.txt 370f3d1e79a52791c42065946711f4eddb6d9820726afd0e436a3c50360475a9
http://ftp.unicode.org/Public/3.2-Update/Unihan-3.2.0.zip 0582b888c4ebab6e3ce8d340c74788f1a68ca662713a1065b9a007f24bb4fe46
http://ftp.unicode.org/Public/6.3.0/ucd/UnicodeData.txt 3f76924f0410ca8ae0e9b5c59bd1ba03196293c32616204b393300f091f52013 UnicodeData-6.3.0.txt
http://ftp.unicode.org/Public/6.3.0/ucd/CompositionExclusions.txt 4ba8ea079ffbffc0025fc31009e95726864feda90d2845c9363c0c40ded8511c CompositionExclusions-6.3.0.txt
http://ftp.unicode.org/Public/6.3.0/ucd/EastAsianWidth.txt bbdf9281767ca4601af3623b62c26ecb834a9f4c46eec629d82339b006da00d8 EastAsianWidth-6.3.0.txt
http://ftp.unicode.org/Public/6.3.0/ucd/DerivedCoreProperties.txt 790826f4cfa82c5845ab4040b5e811f1e67bf1ec4c88cdbf722795c3292b0102 DerivedCoreProperties-6.3.0.txt
http://ftp.unicode.org/Public/6.3.0/ucd/DerivedNormalizationProps.txt c5e867ae043fe5d1cf713150d859356bfdcdba291c39f584af0bfb943f1a9743 DerivedNormalizationProps-6.3.0.txt
http://ftp.unicode.org/Public/6.3.0/ucd/LineBreak.txt 6a38069025127a60f4a809e788fbbd1bb6b95ac8d1bd62e6a78d7870357f3486 LineBreak-6.3.0.txt
http://ftp.unicode.org/Public/6.3.0/ucd/NameAliases.txt a11bed87ec6f264edcf84d581dd2d7ac8ed7ac1c3b2ccb54a83077fdbd34133e NameAliases-6.3.0.txt
http://ftp.unicode.org/Public/6.3.0/ucd/NamedSequences.txt 91fc69ff68b1a89e5f7270545547c747624bc96b0e6c23a791d4265d2fa1f988 NamedSequences-6.3.0.txt
http://ftp.unicode.org/Public/6.3.0/ucd/SpecialCasing.txt 9edafba261e23e72f6e21e3d85d7f15dd4866f38004ab3bfdc6f7057c589d034 SpecialCasing-6.3.0.txt
http://ftp.unicode.org/Public/6.3.0/ucd/CaseFolding.txt 21323e682a2b34400c6af4ab57b9775b7e716150428f092bac5b005a88ab8f42 CaseFolding-6.3.0.txt
http://ftp.unicode.org/Public/6.3.0/ucd/Unihan.zip 9e408d71e3aba4ff68f5085569bc1c31c9751f9779f55cf877c222467732991f Unihan-6.3.0.zip
http://ftp.unicode.org/Public/MAPPINGS/VENDORS/MICSFT/PC/CP437.TXT 6bad4dabcdf5940227c7d81fab130dcb18a77850b5d79de28b5dc4e047b0aaac


@@ -0,0 +1,18 @@
SPDX-FileCopyrightText: 2022 fosslinux <fosslinux@aussies.space>
SPDX-License-Identifier: PSF-2.0
When defined, __DATE__ is in the format MMM DD YYYY. "xx/xx/xx" does
not comply with this, so the parser in Lib/platform.py fails on it.
--- Modules/getbuildinfo.c 2022-12-21 13:41:18.120573458 +1100
+++ Modules/getbuildinfo.c 2022-12-21 13:41:30.399716652 +1100
@@ -8,7 +8,7 @@
#ifdef __DATE__
#define DATE __DATE__
#else
-#define DATE "xx/xx/xx"
+#define DATE "xxx xx xxxx"
#endif
#endif


@@ -0,0 +1,20 @@
SPDX-FileCopyrightText: 2022 fosslinux <fosslinux@aussies.space>
SPDX-License-Identifier: PSF-2.0
The MAXGROUPS constant was introduced in this upgrade. Hardcode it to the
value from a "normal" system. Even if the value is wrong, this seems
harmless, since it does next to nothing.
--- Lib/sre_constants.py 2022-12-20 12:05:01.176104156 +1100
+++ Lib/sre_constants.py 2022-12-20 12:05:21.710376396 +1100
@@ -15,7 +15,8 @@
MAGIC = 20171005
-from _sre import MAXREPEAT, MAXGROUPS
+from _sre import MAXREPEAT
+MAXGROUPS = 1073741823
# SRE standard exception (access as sre.error)
# should this really be here?


@@ -0,0 +1,26 @@
SPDX-FileCopyrightText: 2022 fosslinux <fosslinux@aussies.space>
SPDX-License-Identifier: PSF-2.0
I'm not sure what was going on here when this was written, or how
it ever worked! But this small, simple fix does the job.
--- Lib/sre_constants.py 2022-12-20 18:30:21.883561534 +1100
+++ Lib/sre_constants.py 2022-12-20 18:31:23.209190748 +1100
@@ -56,6 +56,7 @@
class _NamedIntConstant(int):
def __new__(cls, value, name):
self = super(_NamedIntConstant, cls).__new__(cls, value)
+ self.value = value
self.name = name
return self
@@ -219,7 +220,7 @@
def dump(f, d, prefix):
items = sorted(d)
for item in items:
- f.write("#define %s_%s %d\n" % (prefix, item, item))
+ f.write("#define %s_%s %d\n" % (prefix, item.name, item.value))
with open("sre_constants.h", "w") as f:
f.write("""\
/*


@@ -0,0 +1,69 @@
# SPDX-FileCopyrightText: 2022 fosslinux <fosslinux@aussies.space>
#
# SPDX-License-Identifier: GPL-3.0-or-later
src_prepare() {
default
# Delete generated files that won't be regenerated
rm Lib/pydoc_data/topics.py
rm Modules/_ssl_data*.h # Breaks _ssl module, but it fails anyway
# Regenerate encodings
grep generated -r . -l | grep encodings | xargs rm
mkdir Tools/unicode/in Tools/unicode/out
mv ../CP437.TXT Tools/unicode/in/
pushd Tools/unicode
python gencodec.py in/ ../../Lib/encodings/
popd
# Regenerate unicode
rm Modules/unicodedata_db.h Modules/unicodename_db.h Objects/unicodetype_db.h
mv ../*.txt ../*.zip .
python Tools/unicode/makeunicodedata.py
# Regenerate sre_constants.h
rm Modules/sre_constants.h
cp Lib/sre_constants.py .
python sre_constants.py
rm sre_constants.py
mv sre_constants.h Modules/
# Regenerate stringprep
rm Lib/stringprep.py
python Tools/unicode/mkstringprep.py > Lib/stringprep.py
# Regenerate autoconf
autoreconf-2.71 -fi
}
src_configure() {
MACHDEP=linux ac_sys_system=Linux \
CPPFLAGS="-U__DATE__ -U__TIME__" \
LDFLAGS="-L/usr/lib/musl" \
./configure \
--build=i386-unknown-linux-musl \
--host=i386-unknown-linux-musl \
--prefix="${PREFIX}" \
--libdir="${PREFIX}/lib/musl" \
--with-system-ffi
}
src_compile() {
# Regenerations
rm Modules/_blake2/blake2s_impl.c
make regen-all
make CPPFLAGS="-U__DATE__ -U__TIME__"
}
src_install() {
default
ln -s "${PREFIX}/lib/musl/python3.8/lib-dynload" "${DESTDIR}${PREFIX}/lib/python3.8/lib-dynload"
ln -s "${PREFIX}/bin/python3.8" "${DESTDIR}${PREFIX}/bin/python"
# Remove non-reproducible .pyc/o files
find "${DESTDIR}" -name "*.pyc" -delete
find "${DESTDIR}" -name "*.pyo" -delete
}


@@ -0,0 +1,23 @@
https://www.python.org/ftp/python/3.8.16/Python-3.8.16.tar.xz d85dbb3774132473d8081dcb158f34a10ccad7a90b96c7e50ea4bb61f5ce4562
http://ftp.unicode.org/Public/3.2-Update/UnicodeData-3.2.0.txt 5e444028b6e76d96f9dc509609c5e3222bf609056f35e5fcde7e6fb8a58cd446
http://ftp.unicode.org/Public/3.2-Update/CompositionExclusions-3.2.0.txt 1d3a450d0f39902710df4972ac4a60ec31fbcb54ffd4d53cd812fc1200c732cb
http://ftp.unicode.org/Public/3.2-Update/EastAsianWidth-3.2.0.txt ce19f35ffca911bf492aab6c0d3f6af3d1932f35d2064cf2fe14e10be29534cb
http://ftp.unicode.org/Public/3.2-Update/DerivedCoreProperties-3.2.0.txt 787419dde91701018d7ad4f47432eaa55af14e3fe3fe140a11e4bbf3db18bb4c
http://ftp.unicode.org/Public/3.2-Update/DerivedNormalizationProps-3.2.0.txt bab49295e5f9064213762447224ccd83cea0cced0db5dcfc96f9c8a935ef67ee
http://ftp.unicode.org/Public/3.2-Update/LineBreak-3.2.0.txt d693ef2a603d07e20b769ef8ba29afca39765588a03e3196294e5be8638ca735
http://ftp.unicode.org/Public/3.2-Update/SpecialCasing-3.2.0.txt 1f7913b74dddff55ee566f6220aa9e465bae6f27709fc21d353b04adb8572b37
http://ftp.unicode.org/Public/3.2-Update/CaseFolding-3.2.0.txt 370f3d1e79a52791c42065946711f4eddb6d9820726afd0e436a3c50360475a9
http://ftp.unicode.org/Public/3.2-Update/Unihan-3.2.0.zip 0582b888c4ebab6e3ce8d340c74788f1a68ca662713a1065b9a007f24bb4fe46
http://ftp.unicode.org/Public/12.1.0/ucd/UnicodeData.txt 93ab1acd8fd9d450463b50ae77eab151a7cda48f98b25b56baed8070f80fc936 UnicodeData-12.1.0.txt
http://ftp.unicode.org/Public/12.1.0/ucd/CompositionExclusions.txt abc8394c5bde62453118b00c1c5842160a04d7fffb2e829ee5426b846596d081 CompositionExclusions-12.1.0.txt
http://ftp.unicode.org/Public/12.1.0/ucd/EastAsianWidth.txt 904500178b2e752635bef27aaed3a2a3718a100bce35ff96b3890be7a8315d8f EastAsianWidth-12.1.0.txt
http://ftp.unicode.org/Public/12.1.0/ucd/DerivedCoreProperties.txt a6eb7a8671fb532fbd88c37fd7b20b5b2e7dbfc8b121f74c14abe2947db0da68 DerivedCoreProperties-12.1.0.txt
http://ftp.unicode.org/Public/12.1.0/ucd/DerivedNormalizationProps.txt 92dcdda84142194a1596f22180fcdf8c0e7f86897f09cc9203c7dc636c549f5f DerivedNormalizationProps-12.1.0.txt
http://ftp.unicode.org/Public/12.1.0/ucd/LineBreak.txt 961f842fc70b5afd1d82c6645e68c10d1f701382aed38ae38cb2ff27f671903c LineBreak-12.1.0.txt
http://ftp.unicode.org/Public/12.1.0/ucd/NameAliases.txt ff61a0687d2f32c0dd1094254b8bde967883b43c2d4d50fd17531d498e41ab2c NameAliases-12.1.0.txt
http://ftp.unicode.org/Public/12.1.0/ucd/NamedSequences.txt d3eb9a288ebeaf9de1237989f490705e287b6f610b59d2459fb1b7c2d8e39c39 NamedSequences-12.1.0.txt
http://ftp.unicode.org/Public/12.1.0/ucd/SpecialCasing.txt 817ce2e9edca8e075a153f54b8f3b020345e37652cd2bda9b1495c366af17e7e SpecialCasing-12.1.0.txt
http://ftp.unicode.org/Public/12.1.0/ucd/CaseFolding.txt 9c772627c6ee77eea6a17b42927b8ee28ca05dc65d6a511062104baaf3d12294 CaseFolding-12.1.0.txt
http://ftp.unicode.org/Public/12.1.0/ucd/Unihan.zip 6e4553f3b5fffe0d312df324d020ef1278d9595932ae03f4e8a2d427de83cdcd Unihan-12.1.0.zip
http://ftp.unicode.org/Public/MAPPINGS/VENDORS/MICSFT/PC/CP437.TXT 6bad4dabcdf5940227c7d81fab130dcb18a77850b5d79de28b5dc4e047b0aaac
https://www.ietf.org/rfc/rfc3454.txt eb722fa698fb7e8823b835d9fd263e4cdb8f1c7b0d234edf7f0e3bd2ccbb2c79


@@ -103,6 +103,28 @@ build autogen-5.18.16 autogen-5.18.16.sh
build musl-1.2.3
build python-2.0.1 stage1.sh
build python-2.0.1 stage2.sh
build python-2.3.7 stage1.sh
build python-2.3.7 stage2.sh
build python-2.5.6
build python-3.1.5 stage1.sh
build python-3.1.5 stage2.sh
build python-3.3.7
build python-3.4.10
build python-3.8.16
build python-3.11.1
if [ "$FORCE_TIMESTAMPS" = True ] ; then
echo 'Forcing all files timestamps to be 0 unix time.'
canonicalise_all_files_timestamp