New upstream version 0.4.5
Paul Brossier
2017-04-10 Paul Brossier <piem@aubio.org>

	[Overview]

	* VERSION: bump to 0.4.5
	* src/io/source_avcodec.c: add support for libswresample
	* aubio: new python command line tool to extract information
	* src/onset/onset.c: add spectral whitening and compression, improve default
	  parameters
	* this_version.py: use centralized script to get current version, adding git
	  sha when building from git repo (thanks to MartinHN)

	[Interface]

	* src/spectral/awhitening.h: add adaptive whitening
	* src/{cvec,mathutils,musicutils}.h: add cvec_logmag, fvec_logmag, and fvec_push
	* src/onset/onset.h: add aubio_onset_set_default_parameters to load optimal
	  parameters of each novelty function, _{set,get}_compression and
	  _{set,get}_awhitening to turn on/off compression and adaptive whitening
	* src/spectral/specdesc.h: add weighted phase

	[Library]

	* src/onset/onset.c: improve default onset parameters (thanks to @superbock
	  for access to his evaluation database), see commit dccfad2 for more details
	* src/pitch/pitch.c: avoid segfault when using invalid parameters
	* src/temporal/biquad.c: fix biquad parameters initialization (thanks to
	  @jurlhardt)

	[Tools]

	* examples/aubio{onset,track}.c: add options --miditap-note and
	  --miditap-velo to set which midi note is triggered at onset/beat (thanks to
	  @tseaver)
	* examples/aubioonset.c: show actual parameters in verbose mode
	* examples/utils.c: improve memory usage to emit midi notes

	[Python]

	* python/ext/py-source.c: add with (PEP 343) and iter (PEP 234) interfaces
	* python/ext/py-sink.c: add with interface (PEP 343)
	* python/lib/aubio/cmd.py: new `aubio` command line tool
	* python/lib/aubio/cut.py: moved from python/scripts/aubiocut

	[Documentation]

	* doc/*.rst: reorganize and improve sphinx manual
	* doc/*.txt: update manpages, add simple manpage for aubio command line
	* doc/full.cfg: derive from doc/web.cfg
	* README.md: simplify and add contribute information

	[Build system]

	* wscript: prefer libswresample over libavresample when available, use
	  current version in manpages, doxygen, and sphinx, update to newest waf
	* setup.py: use entry_points console_scripts to generate scripts, use
	  centralized version from this_version.py, clean up
	* python/lib/moresetuptools.py: detect if libswresample is available
2017-01-08 Paul Brossier <piem@aubio.org>

	[ Overview ]
include AUTHORS COPYING README.md VERSION ChangeLog
include python/README.md
+include this_version.py
include Makefile wscript */wscript_build
include waf waflib/* waflib/*/*
exclude waflib/__pycache__/*
# $ make test_python

WAFCMD=python waf
-WAFURL=https://waf.io/waf-1.9.6

#WAFOPTS:=
# turn on verbose mode
-aubio library
-=============
+aubio
+=====

+[![Travis build status](https://travis-ci.org/aubio/aubio.svg?branch=master)](https://travis-ci.org/aubio/aubio "Travis build status")
+[![Appveyor build status](https://img.shields.io/appveyor/ci/piem/aubio/master.svg)](https://ci.appveyor.com/project/piem/aubio "Appveyor build status")
+[![Landscape code health](https://landscape.io/github/aubio/aubio/master/landscape.svg?style=flat)](https://landscape.io/github/aubio/aubio/master "Landscape code health")
+[![Commits since last release](https://img.shields.io/github/commits-since/aubio/aubio/0.4.4.svg)](https://github.com/aubio/aubio "Commits since last release")
+
+[![Documentation](https://readthedocs.org/projects/aubio/badge/?version=latest)](http://aubio.readthedocs.io/en/latest/?badge=latest "Latest documentation")
+[![DOI](https://zenodo.org/badge/DOI/10.5281/zenodo.438682.svg)](https://doi.org/10.5281/zenodo.438682)
aubio is a library to label music and sounds. It listens to audio signals and
attempts to detect events. For instance, when a drum is hit, at which frequency

- digital filters (low pass, high pass, and more)
- spectral filtering
- transient/steady-state separation
-- sound file and audio devices read and write access
+- sound file read and write access
- various mathematics utilities for music applications

The name aubio comes from _audio_ with a typo: some errors are likely to be
Python module
-------------

-A python module to access the library functions is also provided. Please see
-the file [`python/README.md`](python/README.md) for more information on how to
-use it.
+A python module for aubio is provided. For more information on how to use it,
+please see the file [`python/README.md`](python/README.md) and the
+[manual](https://aubio.org/manual/latest/).
-Examples tools
---------------
+Tools
+-----

-A few simple command line tools are included along with the library:
+The python module comes with the following command line tools:
+
+- `aubio` extracts information from sound files
+- `aubiocut` slices sound files at onset or beat timestamps
+
+Additional command line tools are included along with the library:
- `aubioonset` outputs the time stamp of detected note onsets
- `aubiopitch` attempts to identify a fundamental frequency, or pitch, for

- `aubionotes` emits midi-like notes, with an onset, a pitch, and a duration
- `aubioquiet` extracts quiet and loud regions
-Additionally, the python module comes with the following script:
+Documentation
+-------------

-- `aubiocut` slices sound files at onset or beat timestamps
-
-Implementation and Design Basics
---------------------------------
-
-The library is written in C and is optimised for speed and portability.
-
-The C API is designed in the following way:
-
-    aubio_something_t * new_aubio_something (void * args);
-    aubio_something_do (aubio_something_t * t, void * args);
-    smpl_t aubio_something_get_a_parameter (aubio_something_t *t);
-    uint_t aubio_something_set_a_parameter (aubio_something_t *t, smpl_t a_parameter);
-    void del_aubio_something (aubio_something_t * t);
-
-For performance and real-time operation, no memory allocation or freeing takes
-place in the `_do` methods. Instead, memory allocation should always take place
-in the `new_` methods, whereas free operations are done in the `del_` methods.
+- [manual](https://aubio.org/manual/latest/), generated with sphinx
+- [developer documentation](https://aubio.org/doc/latest/), generated with Doxygen
The latest version of the documentation can be found at:

Build Instructions
------------------

-A number of distributions already include aubio. Check your favorite package
-management system, or have a look at the [download
-page](https://aubio.org/download).
+aubio compiles on Linux, Mac OS X, Windows, Cygwin, and iOS.

-aubio uses [waf](https://waf.io/) to configure, compile, and test the source:
+To compile aubio, you should be able to simply run:

-    ./waf configure
-    ./waf build
+    make

-If waf is not found in the directory, you can download and install it with:
+To compile the python module:

-    make getwaf
+    ./setup.py build

-aubio compiles on Linux, Mac OS X, Cygwin, and iOS.
+See the [manual](https://aubio.org/manual/latest/) for more information about
+[installing aubio](https://aubio.org/manual/latest/installing.html).
-Installation
-------------
+Citation
+--------

-To install aubio library and headers on your system, use:
+Please use the DOI link above to cite this release in your publications. For
+more information, see also the [about
+page](https://aubio.org/manual/latest/about.html) in the [aubio
+manual](https://aubio.org/manual/latest/).

-    sudo ./waf install
-
-To uninstall:
-
-    sudo ./waf uninstall
-
-If you don't have root access to install libaubio on your system, you can use
-libaubio without installing it, either by setting `LD_LIBRARY_PATH` or by
-copying it to `~/lib`.
-
-On Linux, you should be able to set `LD_LIBRARY_PATH` with:
-
-    $ export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$PWD/build/src
-
-On Mac OS X, a copy or a symlink can be made in `~/lib`:
-
-    $ mkdir -p ~/lib
-    $ ln -sf $PWD/build/src/libaubio*.dylib ~/lib/
-
-Note that on Mac OS X systems older than El Capitan (10.11), the `DYLD_LIBRARY_PATH`
-variable can be set as follows:
-
-    $ export DYLD_LIBRARY_PATH=$DYLD_LIBRARY_PATH:$PWD/build/src
-Credits and Publications
-------------------------
-
-This library gathers music signal processing algorithms designed at the Centre
-for Digital Music and elsewhere. This software project was developed along the
-research I did at the Centre for Digital Music, Queen Mary, University of
-London. Most of this C code was written by myself, starting from published
-papers and existing code. The header files of each algorithm contain brief
-descriptions and references to the corresponding papers.
-
-Special thanks go to Juan Pablo Bello, Chris Duxbury, Samer Abdallah, and Alain de
-Cheveigne for their help and publications. Also many thanks to Miguel Ramirez
-and Nicolas Wack for their bug fixing.
-
-Substantial information about the algorithms and their evaluation is gathered
-in:
-
-- Paul Brossier, _[Automatic annotation of musical audio for interactive
-  systems](https://aubio.org/phd)_, PhD thesis, Centre for Digital Music,
-  Queen Mary University of London, London, UK, 2006.
-
-Additional results obtained with this software were discussed in the following
-papers:
-
-- P. M. Brossier and J. P. Bello and M. D. Plumbley, [Real-time temporal
-  segmentation of note objects in music signals](https://aubio.org/articles/brossier04fastnotes.pdf),
-  in _Proceedings of the International Computer Music Conference_, 2004, Miami,
-  Florida, ICMA
-
-- P. M. Brossier and J. P. Bello and M. D. Plumbley, [Fast labelling of note
-  objects in music signals](https://aubio.org/articles/brossier04fastnotes.pdf),
-  in _Proceedings of the International Symposium on Music Information Retrieval_,
-  2004, Barcelona, Spain
-
-Contact Info and Mailing List
------------------------------
+Homepage
+--------

The home page of this project can be found at: https://aubio.org/

-Questions, comments, suggestions, and contributions are welcome. Use the
-mailing list: <aubio-user@aubio.org>.
-
-To subscribe to the list, use the mailman form:
-https://lists.aubio.org/listinfo/aubio-user/
-
-Alternatively, feel free to contact the author directly.
168 | ||
169 | Copyright and License Information | |
170 | --------------------------------- | |
171 | ||
172 | Copyright (C) 2003-2016 Paul Brossier <piem@aubio.org> | |
100 | License | |
101 | ------- | |
173 | 102 | |
174 | 103 | aubio is free software: you can redistribute it and/or modify it under the |
175 | 104 | terms of the GNU General Public License as published by the Free Software |
176 | 105 | Foundation, either version 3 of the License, or (at your option) any later |
177 | 106 | version. |
107 | ||
108 | Contributing | |
109 | ------------ | |
110 | ||
111 | Patches are welcome: please fork the latest git repository and create a feature | |
112 | branch. Submitted requests should pass all continuous integration tests. |
AUBIO_MAJOR_VERSION=0
AUBIO_MINOR_VERSION=4
-AUBIO_PATCH_VERSION=4
+AUBIO_PATCH_VERSION=5
AUBIO_VERSION_STATUS=''
LIBAUBIO_LT_CUR=5
-LIBAUBIO_LT_REV=1
-LIBAUBIO_LT_AGE=5
+LIBAUBIO_LT_REV=2
+LIBAUBIO_LT_AGE=6
dependencies:
  pre:
    - sudo apt-get update; sudo apt-get install make sox pkg-config libavcodec-dev libavformat-dev libavresample-dev libavutil-dev libsndfile1-dev libsamplerate-dev

test:
  pre:
    - make create_test_sounds
  override:
    - nose2 -v
About
=====

This library gathers a collection of music signal processing algorithms written
by several people. The documentation of each algorithm contains a brief
description and references to the corresponding papers.

Credits
-------

Many thanks to everyone who contributed to aubio, including:

- Martin Hermant (`MartinHN <https://github.com/MartinHN>`_)
- Eduard Müller (`emuell <https://github.com/emuell>`_)
- Nils Philippsen (`nphilipp <https://github.com/nphilipp>`_)
- Tres Seaver (`tseaver <https://github.com/tseaver>`_)
- Dirkjan Rijnders (`dirkjankrijnders <https://github.com/dirkjankrijnders>`_)
- Jeffrey Kern (`anwserman <https://github.com/anwserman>`_)
- Sam Alexander (`sxalexander <https://github.com/sxalexander>`_)

Special thanks to Juan Pablo Bello, Chris Duxbury, Samer Abdallah, and Alain de
Cheveigne for their help. Also many thanks to Miguel Ramirez and Nicolas Wack
for their advice and help fixing bugs.

Publications
------------

Substantial information about several of the algorithms and their evaluation
is gathered in:

- Paul Brossier, `Automatic annotation of musical audio for interactive
  systems <https://aubio.org/phd>`_, PhD thesis, Centre for Digital Music,
  Queen Mary University of London, London, UK, 2006.

Additional results obtained with this software were discussed in the following
papers:

- P. M. Brossier and J. P. Bello and M. D. Plumbley, `Real-time temporal
  segmentation of note objects in music signals
  <https://aubio.org/articles/brossier04fastnotes.pdf>`_, in *Proceedings of
  the International Computer Music Conference*, 2004, Miami, Florida, ICMA

- P. M. Brossier and J. P. Bello and M. D. Plumbley, `Fast labelling of note
  objects in music signals
  <https://aubio.org/articles/brossier04fastnotes.pdf>`_, in *Proceedings of
  the International Symposium on Music Information Retrieval*, 2004,
  Barcelona, Spain

Citation
--------

Please refer to the Zenodo link in the file README.md to cite this release.

Copyright
---------

Copyright © 2003-2017 Paul Brossier <piem@aubio.org>

License
-------

aubio is `free <http://www.debian.org/intro/free>`_ and `open source
<http://www.opensource.org/docs/definition.php>`_ software; **you** can
redistribute it and/or modify it under the terms of the `GNU
<https://www.gnu.org/>`_ `General Public License
<https://www.gnu.org/licenses/gpl.html>`_ as published by the `Free Software
Foundation <https://fsf.org>`_, either version 3 of the License, or (at your
option) any later version.

.. note::

   aubio is not MIT or BSD licensed. Contact us if you need it in your
   commercial product.
.. _android:

-Building aubio for Android
---------------------------
+Android build
+-------------

To compile aubio for android, you will need to get the `Android Native
Development Toolkit (NDK) <https://developer.android.com/ndk/>`_, prepare a
NAME
aubio - a command line tool to extract information from sound files

SYNOPSIS

aubio [-h] [-V] <command> ...

COMMANDS

The general syntax is "aubio <command> <soundfile> [options]". The following
commands are available:

onset      get onset times
pitch      extract fundamental frequency
beat       get locations of beats
tempo      get overall tempo in bpm
notes      get midi-like notes
mfcc       extract mel-frequency cepstrum coefficients
melbands   extract mel-frequency energies per band

For a list of available commands, use "aubio -h". For more info about each
command, use "aubio <command> --help".

GENERAL OPTIONS

These options can be used before any command has been specified.

-h, --help show help message and exit

-V, --version show version

COMMON OPTIONS

The following options can be used with all commands:

<source_uri>, -i <source_uri>, --input <source_uri> input sound file to
analyse (required)

-r <freq>, --samplerate <freq> samplerate at which the file should be
represented (default: 0, e.g. samplerate of the input sound)

-H <size>, --hopsize <size> overlap size, number of samples between two
consecutive analyses (default: 256)

-B <size>, --bufsize <size> buffer size, number of samples used for each
analysis (e.g. FFT length, default: 512)

-h, --help show help message and exit

-T format, --time-format format select time values output format (samples,
ms, seconds) (default: seconds)

-v, --verbose be verbose (increment verbosity by 1, default: 1)

-q, --quiet be quiet (set verbosity to 0)

ONSET

The following additional options can be used with the "onset" subcommand.

-m <method>, --method <method> onset novelty function
<default|energy|hfc|complex|phase|specdiff|kl|mkl|specflux> (default:
default)

-t <threshold>, --threshold <threshold> threshold (default: unset)

-s <value>, --silence <value> silence threshold, in dB (default: -70)

-M <value>, --minioi <value> minimum Inter-Onset Interval (default: 12ms)

PITCH

The following additional options can be used with the "pitch" subcommand.

-m <method>, --method <method> pitch detection method
<default|yinfft|yin|mcomb|fcomb|schmitt> (default: default, e.g. yinfft)

-t <threshold>, --threshold <threshold> tolerance (default: unset)

-s <value>, --silence <value> silence threshold, in dB (default: -70)

The default buffer size for the pitch algorithm is 2048. The default hop size
is 256.

BEAT

The "beat" command accepts all common options and no additional options.

The default buffer size for the beat algorithm is 1024. The default hop size
is 512.

TEMPO

The "tempo" command accepts all common options and no additional options.

The default buffer size for the tempo algorithm is 1024. The default hop size
is 512.

NOTES

The "notes" command accepts all common options and no additional options.

MFCC

The "mfcc" command accepts all common options and no additional options.

MELBANDS

The "melbands" command accepts all common options and no additional options.

EXAMPLES

Extract onsets using a minimum inter-onset interval of 30ms:

aubio onset /path/to/input_file -M 30ms

Extract pitch with method "mcomb" and a silence threshold of -90dB:

aubio pitch /path/to/input_file -m mcomb -s -90.0

Extract MFCC using the standard Slaney implementation:

aubio mfcc /path/to/input_file -r 44100

SEE ALSO

aubiocut(1)

AUTHOR

This manual page was written by Paul Brossier <piem@aubio.org>. Permission is
granted to copy, distribute and/or modify this document under the terms of
the GNU General Public License as published by the Free Software Foundation,
either version 3 of the License, or (at your option) any later version.
[-O method] [-t thres]
[-T time-format]
[-s sil] [-m] [-f]
-[-j] [-v] [-h]
+[-j] [-N miditap-note] [-V miditap-velo]
+[-v] [-h]

DESCRIPTION

-j, --jack Use Jack input/output. You will need a Jack connection
controller to feed aubio some signal and listen to its output.

+-N, --miditap-note Override note value for MIDI tap. Defaults to 69.
+
+-V, --miditap-velo Override velocity value for MIDI tap. Defaults to 65.

-h, --help Print a short help message and exit.
[-r rate] [-B win] [-H hop]
[-T time-format]
[-s sil] [-m]
-[-j] [-v] [-h]
+[-j] [-N miditap-note] [-V miditap-velo]
+[-v] [-h]

DESCRIPTION

-j, --jack Use Jack input/output. You will need a Jack connection
controller to feed aubio some signal and listen to its output.

+-N, --miditap-note Override note value for MIDI tap. Defaults to 69.
+
+-V, --miditap-velo Override velocity value for MIDI tap. Defaults to 65.

-T, --timeformat format Set time format (samples, ms, seconds). Defaults to
seconds.
Pre-compiled binaries
---------------------

`Pre-compiled binaries <https://aubio.org/download>`_
are available for
`macOS <https://aubio.org/download#osx>`_,
`iOS <https://aubio.org/download#ios>`_,
and
`windows <https://aubio.org/download#win>`_.

To use aubio in a macOS or iOS application, see :ref:`xcode-frameworks-label`.
The **latest stable release** can be downloaded from https://aubio.org/download::

-    $ curl -O http://aubio.org/pub/aubio-0.4.3.tar.bz2
-    $ tar xf aubio-0.4.3.tar.bz2
-    $ cd aubio-0.4.3
+    $ curl -O http://aubio.org/pub/aubio-<version>.tar.bz2
+    $ tar xf aubio-<version>.tar.bz2
+    $ cd aubio-<version>/

Git repository
--------------

The **latest git branch** can be obtained with::

    $ git clone git://git.aubio.org/git/aubio
-    $ cd aubio
+    $ cd aubio/

The following command will fetch the correct `waf`_ version (not included in
aubio's git)::

    $ waf configure build
+Running as a user
+-----------------
+
+To use aubio without actually installing it, for instance if you don't have
+root access to install libaubio on your system:
+
+On Linux or macOS, sourcing the script ``scripts/setenv_local.sh`` should help::
+
+    $ source ./scripts/setenv_local.sh
+
+This script sets ``LD_LIBRARY_PATH``, for libaubio, and ``PYTHONPATH``, for the
+python module.
+
+On Linux, you should be able to set ``LD_LIBRARY_PATH`` with::
+
+    $ export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$PWD/build/src
+
+On Mac OS X, a copy or a symlink can be made in ``~/lib``::
+
+    $ mkdir -p ~/lib
+    $ ln -sf $PWD/build/src/libaubio*.dylib ~/lib/
+
+Note that on Mac OS X systems older than El Capitan (10.11), the
+``DYLD_LIBRARY_PATH`` variable can be set as follows::
+
+    $ export DYLD_LIBRARY_PATH=$DYLD_LIBRARY_PATH:$PWD/build/src
Cleaning
--------

.. _Git Bash: https://git-for-windows.github.io/

-.. toctree::
-   :maxdepth: 2
+.. _xcode-frameworks-label:
+
+.. include:: xcode_frameworks.rst
+
+.. include:: android.rst
Command line tools
==================

-A few simple command line tools are included along with the library.
+The python module comes with the following tools:
+
+- ``aubio`` estimates and extracts descriptors from sound files
+- ``aubiocut`` slices sound files at onset or beat timestamps
+
+More command line tools are included along with the library.

- ``aubioonset`` outputs the time stamp of detected note onsets
- ``aubiopitch`` attempts to identify a fundamental frequency, or pitch, for
- ``aubionotes`` emits midi-like notes, with an onset, a pitch, and a duration
- ``aubioquiet`` extracts quiet and loud regions

-Additionally, the python module comes with the following script:
-
-- ``aubiocut`` slices sound files at onset or beat timestamps
+``aubio``
+---------
+
+.. literalinclude:: aubio.txt
+   :language: text

-.. toctree::
-
-   cli_features
+``aubiocut``
+------------
+
+.. literalinclude:: aubiocut.txt
+   :language: text

``aubioonset``
--------------

.. literalinclude:: aubioonset.txt
+   :language: text

``aubiopitch``
--------------

.. literalinclude:: aubiopitch.txt
+   :language: text

``aubiomfcc``
-------------

.. literalinclude:: aubiomfcc.txt
+   :language: text

``aubiotrack``
--------------

.. literalinclude:: aubiotrack.txt
+   :language: text

``aubionotes``
--------------

.. literalinclude:: aubionotes.txt
+   :language: text

``aubioquiet``
--------------

.. literalinclude:: aubioquiet.txt
+   :language: text

-``aubiocut``
-------------
-
-.. literalinclude:: aubiocut.txt
+.. include:: cli_features.rst
Command line features
---------------------

+--------------+-------+-------+------+-------+-------+-------+------+------------------+
| feat vs. prg | onset | pitch | mfcc | track | notes | quiet | cut1 | short options    |
# serve to show the default.

import sys, os

+# get version using this_version.py
+sys.path.append(os.path.join(os.path.dirname(os.path.abspath(__file__)), '..'))
+from this_version import get_aubio_version

# If extensions (or modules to document with autodoc) are in another directory,
# add these directories to sys.path here. If the directory is relative to the

# built documents.
#
# The short X.Y version.
-version = '0.4'
+version = get_aubio_version()[:3]
# The full version, including alpha/beta/rc tags.
-release = '0.4.4'
+release = get_aubio_version()

# The language for content autogenerated by Sphinx. Refer to documentation
# for a list of supported languages.

# List of patterns, relative to source directory, that match files and
# directories to ignore when looking for source files.
-exclude_patterns = ['_build']
+exclude_patterns = ['_build',
+                    'statuslinks.rst',
+                    'download.rst',
+                    'binaries.rst',
+                    'debian_packages.rst',
+                    'building.rst',
+                    'android.rst',
+                    'xcode_frameworks.rst',
+                    'requirements.rst',
+                    'cli_features.rst',
+                    ]

# The reST default role (used for this markup: `text`) to use for all documents.
#default_role = None

# How to display URL addresses: 'footnote', 'no', or 'inline'.
#texinfo_show_urls = 'footnote'

+def setup(app):
+    if release.endswith('~alpha'): app.tags.add('devel')
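The centralized version logic can be illustrated with a short sketch. This is not the actual `this_version.py` (whose contents are not part of this diff); the helper names below are hypothetical. It only demonstrates one plausible way to assemble the version string from the `KEY=VALUE` pairs in the `VERSION` file, so that a call like `get_aubio_version()[:3]` would yield the short `'0.4'` form used by conf.py:

```python
def parse_version_file(text):
    """Parse KEY=VALUE lines into a dict, ignoring blanks and comments."""
    info = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith('#'):
            continue
        key, _, value = line.partition('=')
        # strip surrounding quotes, e.g. AUBIO_VERSION_STATUS=''
        info[key.strip()] = value.strip().strip("'\"")
    return info

def get_version(info):
    """Build MAJOR.MINOR.PATCH plus an optional status suffix."""
    version = '.'.join(info['AUBIO_%s_VERSION' % part]
                       for part in ('MAJOR', 'MINOR', 'PATCH'))
    return version + info.get('AUBIO_VERSION_STATUS', '')

# sample input matching the VERSION file shown earlier in this changeset
sample = """AUBIO_MAJOR_VERSION=0
AUBIO_MINOR_VERSION=4
AUBIO_PATCH_VERSION=5
AUBIO_VERSION_STATUS=''
"""
```

With this sketch, `get_version(parse_version_file(sample))` produces the full release string, and slicing off the first three characters gives the short sphinx `version`.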
2 | 2 | Developping with aubio |
3 | 3 | ====================== |
4 | 4 | |
5 | Read `Contribute`_ to report issues and request new features. | |
5 | Here is a brief overview of the C library. | |
6 | 6 | |
7 | See `Doxygen documentation`_ for the complete documentation of the C library, | |
8 | built using `Doxygen <http://www.doxygen.org/>`_. | |
7 | For a more detailed list of available functions, see the `API documentation | |
8 | <https://aubio.org/doc/latest/>`_. | |
9 | 9 | |
10 | Below is a brief `Library overview`_. | |
10 | To report issues, ask questions, and request new features, use `Github Issues | |
11 | <https://github.com/aubio/aubio/issues>`_ | |
11 | 12 | |
12 | Library overview | |
13 | ---------------- | |
13 | Design Basics | |
14 | ------------- | |
14 | 15 | |
15 | Here is a brief overview of the C library. See also the `Doxygen | |
16 | documentation`_ for a more detailed list of available functions. | |
16 | The library is written in C and is optimised for speed and portability. | |
17 | 17 | |
18 | Vectors and matrix | |
19 | `````````````````` | |
20 | ||
21 | ``fvec_t`` are used to hold vectors of float (``smpl_t``). | |
22 | ||
23 | .. literalinclude:: ../tests/src/test-fvec.c | |
24 | :language: C | |
25 | :lines: 7 | |
26 | ||
18 | All memory allocations take place in the `new_` methods. Each successful call | |
19 | to `new_` should have a matching call to `del_` to deallocate the object. | |
27 | 20 | |
28 | 21 | .. code-block:: C |
29 | 22 | |
30 | // set some elements | |
31 | vec->data[511] = 2.; | |
32 | vec->data[vec->length-2] = 1.; | |
23 | // new_ to create an object foobar | |
24 | aubio_foobar_t * new_aubio_foobar(void * args); | |
25 | // del_ to delete foobar | |
26 | void del_aubio_foobar (aubio_foobar_t * foobar); | |
33 | 27 | |
34 | Similarly, ``fmat_t`` are used to hold matrix of floats. | |
28 | The main computations are done in the `_do` methods. | |
35 | 29 | |
36 | .. literalinclude:: ../tests/src/test-fmat.c | |
37 | :language: C | |
38 | :lines: 9-19 | |
30 | .. code-block:: C | |
31 | ||
32 | // _do to process output = foobar(input) | |
33 | void aubio_foobar_do (aubio_foobar_t * foobar, fvec_t * input, cvec_t * output); | 
34 | ||
35 | Most parameters can be read and written at any time: | |
36 | ||
37 | .. code-block:: C | |
38 | ||
39 | // _get_param to get foobar.param | |
40 | smpl_t aubio_foobar_get_a_parameter (aubio_foobar_t * foobar); | |
41 | // _set_param to set foobar.param | |
42 | uint_t aubio_foobar_set_a_parameter (aubio_foobar_t * foobar, smpl_t a_parameter); | |
43 | ||
44 | In some cases, more functions are available: | 
45 | ||
46 | .. code-block:: C | |
47 | ||
48 | // non-real time functions | |
49 | uint_t aubio_foobar_reset(aubio_foobar_t * t); | |
50 | ||
51 | Basic Types | |
52 | ----------- | |
53 | ||
54 | .. code-block:: C | |
55 | ||
56 | // integers | |
57 | uint_t n = 10; // unsigned | |
58 | sint_t delay = -90; // signed | |
59 | ||
60 | // float | |
61 | smpl_t a = -90.; // single precision | 
62 | lsmp_t f = 0.024; // double precision | 
63 ||
64 | // vector of floats (single precision) | 
65 | fvec_t * vec = new_fvec(n); | |
66 | vec->data[0] = 1; | |
67 | vec->data[vec->length-1] = 1.; // vec->data has n elements | |
68 | fvec_print(vec); | |
69 | del_fvec(vec); | |
70 | ||
71 | // complex data | |
72 | cvec_t * fftgrain = new_cvec(n); | |
73 | fftgrain->norm[0] = 1.; // fftgrain->norm has n/2+1 elements | 
74 | fftgrain->phas[n/2] = 3.1415; // fftgrain->phas as well | 
75 | del_cvec(fftgrain); | |
76 | ||
77 | // matrix | |
78 | fmat_t * mat = new_fmat (height, length); | |
79 | mat->data[height-1][0] = 1; // mat->data has height rows | |
80 | mat->data[0][length-1] = 10; // mat->data[0] has length columns | |
81 | del_fmat(mat); | |
82 | ||
39 | 83 | |
40 | 84 | Reading a sound file |
41 | ```````````````````` | |
42 | In this example, ``aubio_source`` is used to read a media file. | |
85 | -------------------- | |
43 | 86 | |
44 | First, create the objects we need. | |
87 | In this example, `aubio_source <https://aubio.org/doc/latest/source_8h.html>`_ | |
88 | is used to read a media file. | |
89 | ||
90 | First, define a few variables and allocate some memory. | |
45 | 91 | |
46 | 92 | .. literalinclude:: ../tests/src/io/test-source.c |
47 | 93 | :language: C |
57 | 103 | :language: C |
58 | 104 | :lines: 40-44 |
59 | 105 | |
60 | At the end of the processing loop, clean-up and de-allocate memory: | |
106 | At the end of the processing loop, memory is deallocated: | |
61 | 107 | |
62 | 108 | .. literalinclude:: ../tests/src/io/test-source.c |
63 | 109 | :language: C |
64 | :lines: 50-56 | |
110 | :lines: 55-56 | |
65 | 111 | |
66 | 112 | See the complete example: :download:`test-source.c |
67 | 113 | <../tests/src/io/test-source.c>`. |
68 | 114 | |
69 | Computing the spectrum | |
70 | `````````````````````` | |
115 | Computing a spectrum | |
116 | -------------------- | |
71 | 117 | |
72 | 118 | Now let's create a phase vocoder: |
73 | 119 | |
79 | 125 | |
80 | 126 | .. literalinclude:: ../tests/src/spectral/test-phasevoc.c |
81 | 127 | :language: C |
82 | :lines: 21-35 | |
128 | :lines: 20-37 | |
129 | ||
130 | Time to clean up the previously allocated memory: | |
131 | ||
132 | .. literalinclude:: ../tests/src/spectral/test-phasevoc.c | |
133 | :language: C | |
134 | :lines: 39-44 | |
83 | 135 | |
84 | 136 | See the complete example: :download:`test-phasevoc.c |
85 | 137 | <../tests/src/spectral/test-phasevoc.c>`. |
89 | 141 | Doxygen documentation |
90 | 142 | --------------------- |
91 | 143 | |
92 | The latest version of the doxygen documentation is available at: | |
144 | The latest version of the API documentation is built using `Doxygen | |
145 | <http://www.doxygen.org/>`_ and is available at: | |
93 | 146 | |
94 | https://aubio.org/doc/latest | |
147 | https://aubio.org/doc/latest/ | |
95 | 148 | |
96 | 149 | Contribute |
97 | 150 | ---------- |
98 | 151 | |
99 | 152 | Please report any issues and feature requests at the `Github issue tracker
100 | 153 | <https://github.com/aubio/aubio/issues>`_. Patches and pull requests welcome!
101 |
4 | 4 | |
5 | 5 | A number of distributions already include aubio. Check your favorite package |
6 | 6 | management system, or have a look at the `aubio download page |
7 | <http://aubio.org/download>`_ for more options. | |
8 | ||
9 | To use aubio in a macOS or iOS application, see :ref:`xcode-frameworks-label`. | |
7 | <https://aubio.org/download>`_ for more options. | |
10 | 8 | |
11 | 9 | To use aubio in an android project, see :ref:`android`. |
12 | 10 | |
13 | .. toctree:: | |
11 | To compile aubio from source, read :ref:`building`. | |
14 | 12 | |
15 | debian_packages | |
16 | xcode_frameworks | |
17 | android | |
13 | .. include:: binaries.rst | |
18 | 14 | |
19 | To compile aubio from source, read :ref:`building`. | |
15 | .. include:: debian_packages.rst |
16 | 16 | * :ref:`manpages` |
17 | 17 | * :ref:`develop` |
18 | 18 | * :ref:`building` |
19 | ||
20 | .. only:: devel | |
21 | ||
22 | .. include:: statuslinks.rst | |
19 | 23 | |
20 | 24 | Project pages |
21 | 25 | ============= |
57 | 61 | The name aubio comes from *audio* with a typo: some errors are likely to be |
58 | 62 | found in the results. |
59 | 63 | |
60 | Copyright | |
61 | ========= | |
62 | ||
63 | Copyright © 2003-2016 Paul Brossier <piem@aubio.org> | |
64 | ||
65 | License | |
66 | ======= | |
67 | ||
68 | aubio is a `free <http://www.debian.org/intro/free>`_ and `open source | |
69 | <http://www.opensource.org/docs/definition.php>`_ software; **you** can | |
70 | redistribute it and/or modify it under the terms of the `GNU | |
71 | <https://www.gnu.org/>`_ `General Public License | |
72 | <https://www.gnu.org/licenses/gpl.html>`_ as published by the `Free Software | |
73 | Foundation <https://fsf.org>`_, either version 3 of the License, or (at your | |
74 | option) any later version. | |
75 | ||
76 | .. note:: | |
77 | ||
78 | aubio is not MIT or BSD licensed. Contact the author if you need it in your | |
79 | commercial product. | |
80 | ||
81 | 64 | Content |
82 | 65 | ======= |
83 | 66 | |
88 | 71 | python_module |
89 | 72 | cli |
90 | 73 | develop |
74 | about |
3 | 3 | aubio runs on Linux, Windows, macOS, iOS, Android, and probably a few other
4 | 4 | operating systems. |
5 | 5 | |
6 | To download a pre-compiled version of the library, head to :ref:`download`. | |
6 | Aubio is available as a C library and as a python module. | |
7 | 7 | |
8 | To install the python extension, head to :ref:`python`. | |
8 | Cheat sheet | |
9 | ----------- | |
9 | 10 | |
10 | To compile aubio form source, first check the :ref:`requirements`, then read | |
11 | :ref:`building`. | |
11 | - :ref:`get aubio latest source code <building>`:: | |
12 | 12 | |
13 | .. toctree:: | |
14 | :maxdepth: 2 | |
13 | # official repo | |
14 | git clone https://git.aubio.org/aubio/aubio | |
15 | # mirror | |
16 | git clone https://github.com/aubio/aubio | |
17 | # latest release | |
18 | wget https://aubio.org/pub/aubio-<version>.tar.gz | |
15 | 19 | |
16 | download | |
17 | building | |
18 | requirements | |
20 | ||
21 | - :ref:`build aubio from source <building>`:: | |
22 | ||
23 | # 1. simple | |
24 | cd aubio | |
25 | make | |
26 | ||
27 | # 2. step by step | |
28 | ./scripts/get_waf.sh | |
29 | ./waf configure | |
30 | ./waf build | |
31 | sudo ./waf install | |
32 | ||
33 | - :ref:`install python-aubio from source <python>`:: | |
34 | ||
35 | # from git | |
36 | pip install git+https://git.aubio.org/aubio/aubio/ | |
37 | # mirror | |
38 | pip install git+https://github.com/aubio/aubio/ | |
39 | # from latest release | |
40 | pip install https://aubio.org/pub/aubio-latest.tar.bz2 | |
41 | # from pypi | |
42 | pip install aubio | |
43 | # from source directory | |
44 | cd aubio | |
45 | pip install -v . | |
46 | ||
47 | - :ref:`install python-aubio from a pre-compiled binary <python>`:: | |
48 | ||
49 | # conda [osx, linux, win] | |
50 | conda install -c conda-forge aubio | |
51 | # .deb (debian, ubuntu) [linux] | |
52 | sudo apt-get install python3-aubio python-aubio aubio-tools | |
53 | # brew [osx] | |
54 | brew install aubio --with-python | |
55 | ||
56 | - :ref:`get a pre-compiled version of libaubio <download>`:: | |
57 | ||
58 | # .deb (linux) WARNING: old version | |
59 | sudo apt-get install aubio-tools | |
60 | ||
61 | # python module | |
62 | ./setup.py install | |
63 | # using pip | |
64 | pip install . | |
65 | ||
66 | - :ref:`check the list of optional dependencies <requirements>`:: | |
67 | ||
68 | # debian / ubuntu | |
69 | dpkg -l libavcodec-dev libavutil-dev libavformat-dev \ | |
70 | libswresample-dev libavresample-dev \ | |
71 | libsamplerate-dev libsndfile-dev \ | |
72 | txt2man doxygen | |
73 | ||
74 | .. include:: download.rst | |
75 | ||
76 | .. include:: building.rst | |
77 | ||
78 | .. include:: requirements.rst |
0 | Current status | |
1 | ============== | |
2 | ||
3 | .. image:: https://travis-ci.org/aubio/aubio.svg?branch=master | |
4 | :target: https://travis-ci.org/aubio/aubio | |
5 | :alt: Travis build status | |
6 | ||
7 | .. image:: https://ci.appveyor.com/api/projects/status/f3lhy3a57rkgn5yi?svg=true | |
8 | :target: https://ci.appveyor.com/project/piem/aubio/ | |
9 | :alt: Appveyor build status | |
10 | ||
11 | .. image:: https://landscape.io/github/aubio/aubio/master/landscape.svg?style=flat | |
12 | :target: https://landscape.io/github/aubio/aubio/master | |
13 | :alt: Landscape code health | |
14 | ||
15 | .. image:: https://readthedocs.org/projects/aubio/badge/?version=latest | |
16 | :target: https://aubio.readthedocs.io/en/latest/?badge=latest | |
17 | :alt: Documentation status | |
18 | ||
19 | .. image:: https://img.shields.io/github/commits-since/aubio/aubio/0.4.4.svg?maxAge=2592000 | |
20 | :target: https://github.com/aubio/aubio | |
21 | :alt: Commits since last release | |
22 | ||
23 |
37 | 37 | # could be handy for archiving the generated documentation or if some version |
38 | 38 | # control system is used. |
39 | 39 | |
40 | PROJECT_NUMBER = "0.4.4" | |
40 | PROJECT_NUMBER = "latest" | |
41 | 41 | |
42 | 42 | # Using the PROJECT_BRIEF tag one can provide an optional one line description |
43 | 43 | # for a project that appears at the top of each page and should give viewer a |
0 | .. _xcode-frameworks-label: | |
1 | ||
2 | Using aubio frameworks in Xcode | |
3 | ------------------------------- | |
0 | Frameworks for Xcode | |
1 | -------------------- | |
4 | 2 | |
5 | 3 | `Binary frameworks`_ are available and ready to use in your XCode project, for |
6 | 4 | `iOS`_ and `macOS`_. |
33 | 31 | |
34 | 32 | import aubio |
35 | 33 | |
34 | Using aubio from Swift | 
35 | ---------------------- | |
36 | ||
37 | Once you have downloaded and installed :ref:`aubio.framework | |
38 | <xcode-frameworks-label>`, you should be able to use aubio from C, Obj-C, and | 
39 | Swift source files. | |
40 | ||
41 | ||
42 | Here is a short example showing how to read a sound file in Swift: | 
43 | ||
44 | ||
45 | .. code-block:: swift | |
46 | ||
47 | import aubio | |
48 | ||
49 | let path = Bundle.main.path(forResource: "example", ofType: "mp4") | |
50 | if (path != nil) { | |
51 | let hop_size : uint_t = 512 | |
52 | let a = new_fvec(hop_size) | |
53 | let b = new_aubio_source(path, 0, hop_size) | |
54 | var read: uint_t = 0 | |
55 | var total_frames : uint_t = 0 | |
56 | while (true) { | |
57 | aubio_source_do(b, a, &read) | |
58 | total_frames += read | |
59 | if (read < hop_size) { break } | |
60 | } | |
61 | print("read", total_frames, "frames at", aubio_source_get_samplerate(b), "Hz") | |
62 | del_aubio_source(b) | |
63 | del_fvec(a) | |
64 | } else { | |
65 | print("could not find file") | |
66 | } | |
67 | ||
68 | ||
36 | 69 | .. _Binary frameworks: https://aubio.org/download |
37 | 70 | .. _iOS: https://aubio.org/download#ios |
38 | 71 | .. _macOS: https://aubio.org/download#osx |
39 | .. _Download: https://aubio.org/download |
42 | 42 | } else { |
43 | 43 | aubio_wavetable_stop ( wavetable ); |
44 | 44 | } |
45 | if (mix_input) | |
45 | if (mix_input) { | |
46 | 46 | aubio_wavetable_do (wavetable, ibuf, obuf); |
47 | else | |
47 | } else { | |
48 | 48 | aubio_wavetable_do (wavetable, obuf, obuf); |
49 | } | |
49 | 50 | } |
50 | 51 | |
51 | 52 | void process_print (void) |
60 | 61 | int ret = 0; |
61 | 62 | examples_common_init(argc,argv); |
62 | 63 | |
63 | verbmsg ("using source: %s at %dHz\n", source_uri, samplerate); | |
64 | verbmsg ("onset method: %s, ", onset_method); | |
65 | verbmsg ("buffer_size: %d, ", buffer_size); | |
66 | verbmsg ("hop_size: %d, ", hop_size); | |
67 | verbmsg ("silence: %f, ", silence_threshold); | |
68 | verbmsg ("threshold: %f\n", onset_threshold); | |
69 | ||
70 | 64 | o = new_aubio_onset (onset_method, buffer_size, hop_size, samplerate); |
71 | 65 | if (o == NULL) { ret = 1; goto beach; } |
72 | 66 | if (onset_threshold != 0.) |
75 | 69 | aubio_onset_set_silence (o, silence_threshold); |
76 | 70 | if (onset_minioi != 0.) |
77 | 71 | aubio_onset_set_minioi_s (o, onset_minioi); |
72 | ||
73 | verbmsg ("using source: %s at %dHz\n", source_uri, samplerate); | |
74 | verbmsg ("onset method: %s, ", onset_method); | |
75 | verbmsg ("buffer_size: %d, ", buffer_size); | |
76 | verbmsg ("hop_size: %d, ", hop_size); | |
77 | verbmsg ("silence: %f, ", aubio_onset_get_silence(o)); | |
78 | verbmsg ("threshold: %f, ", aubio_onset_get_threshold(o)); | |
79 | verbmsg ("awhitening: %f, ", aubio_onset_get_awhitening(o)); | |
80 | verbmsg ("compression: %f\n", aubio_onset_get_compression(o)); | |
78 | 81 | |
79 | 82 | onset = new_fvec (1); |
80 | 83 |
45 | 45 | } else { |
46 | 46 | aubio_wavetable_stop ( wavetable ); |
47 | 47 | } |
48 | if (mix_input) | |
48 | if (mix_input) { | |
49 | 49 | aubio_wavetable_do (wavetable, ibuf, obuf); |
50 | else | |
50 | } else { | |
51 | 51 | aubio_wavetable_do (wavetable, obuf, obuf); |
52 | } | |
52 | 53 | } |
53 | 54 | |
54 | 55 | void process_print (void) { |
116 | 116 | #endif /* PROG_HAS_OUTPUT */ |
117 | 117 | #ifdef PROG_HAS_JACK |
118 | 118 | " -j --jack use Jack\n" |
119 | #if defined(PROG_HAS_ONSET) && !defined(PROG_HAS_PITCH) | |
120 | " -N --miditap-note MIDI note; default=69.\n" | |
121 | " -V --miditap-velo MIDI velocity; default=65.\n" | |
122 | #endif /* defined(PROG_HAS_ONSET) && !defined(PROG_HAS_PITCH) */ | |
119 | 123 | #endif /* PROG_HAS_JACK */ |
120 | 124 | " -v --verbose be verbose\n" |
121 | 125 | " -h --help display this message\n" |
135 | 139 | "i:r:B:H:" |
136 | 140 | #ifdef PROG_HAS_JACK |
137 | 141 | "j" |
142 | #if defined(PROG_HAS_ONSET) && !defined(PROG_HAS_PITCH) | |
143 | "N:V:" | |
144 | #endif /* defined(PROG_HAS_ONSET) && !defined(PROG_HAS_PITCH) */ | |
138 | 145 | #endif /* PROG_HAS_JACK */ |
139 | 146 | #ifdef PROG_HAS_OUTPUT |
140 | 147 | "o:" |
163 | 170 | {"hopsize", 1, NULL, 'H'}, |
164 | 171 | #ifdef PROG_HAS_JACK |
165 | 172 | {"jack", 0, NULL, 'j'}, |
173 | #if defined(PROG_HAS_ONSET) && !defined(PROG_HAS_PITCH) | |
174 | {"miditap-note", 1, NULL, 'N'}, | |
175 | {"miditap-velo", 1, NULL, 'V'}, | |
176 | #endif /* PROG_HAS_ONSET !PROG_HAS_PITCH */ | |
166 | 177 | #endif /* PROG_HAS_JACK */ |
167 | 178 | #ifdef PROG_HAS_OUTPUT |
168 | 179 | {"output", 1, NULL, 'o'}, |
206 | 217 | case 'j': |
207 | 218 | usejack = 1; |
208 | 219 | break; |
220 | case 'N': | |
221 | miditap_note = (smpl_t) atoi (optarg); | |
222 | break; | |
223 | case 'V': | |
224 | miditap_velo = (smpl_t) atoi (optarg); | |
225 | break; | |
209 | 226 | case 'i': |
210 | 227 | source_uri = optarg; |
211 | 228 | break; |
75 | 75 | |
76 | 76 | #if HAVE_JACK |
77 | 77 | aubio_jack_t *jack_setup; |
78 | jack_midi_event_t ev; | |
78 | 79 | #endif /* HAVE_JACK */ |
79 | 80 | |
80 | 81 | void examples_common_init (int argc, char **argv); |
126 | 127 | |
127 | 128 | void examples_common_del (void) |
128 | 129 | { |
130 | #ifdef HAVE_JACK | |
131 | if (ev.buffer) free(ev.buffer); | |
132 | #endif | |
129 | 133 | del_fvec (ibuf); |
130 | 134 | del_fvec (obuf); |
131 | 135 | aubio_cleanup (); |
141 | 145 | if (usejack) { |
142 | 146 | |
143 | 147 | #ifdef HAVE_JACK |
148 | ev.size = 3; | |
149 | ev.buffer = malloc (3 * sizeof (jack_midi_data_t)); | |
150 | ev.time = 0; // send it now | |
144 | 151 | debug ("Jack activation ...\n"); |
145 | 152 | aubio_jack_activate (jack_setup, process_func); |
146 | 153 | debug ("Processing (Ctrl+C to quit) ...\n"); |
184 | 191 | send_noteon (smpl_t pitch, smpl_t velo) |
185 | 192 | { |
186 | 193 | #ifdef HAVE_JACK |
187 | jack_midi_event_t ev; | |
188 | ev.size = 3; | |
189 | ev.buffer = malloc (3 * sizeof (jack_midi_data_t)); // FIXME | |
190 | ev.time = 0; | |
191 | 194 | if (usejack) { |
192 | 195 | ev.buffer[2] = velo; |
193 | 196 | ev.buffer[1] = pitch; |
0 | #! /usr/bin/env python | |
1 | ||
2 | import numpy as np | |
3 | from aubio import pitch | |
4 | import pylab as plt | |
5 | ||
6 | buf_size = 2048 * 1 | |
7 | hop_size = buf_size // 4 | |
8 | ||
9 | samplerate = 44100 | |
10 | minfreq = 40 | |
11 | maxfreq = 6000 | |
12 | ||
13 | def sinewave(freq, duration, samplerate = samplerate): | |
14 | """ generate a sinewave """ | |
15 | length = hop_size | |
16 | while length < duration * samplerate: | |
17 | length += hop_size | |
18 | return np.sin( 2. * np.pi * np.arange(length) * freq / samplerate ).astype("float32") | |
19 | ||
20 | def get_stats_for_pitch_method(method, freqs, samplerate = samplerate): | |
21 | """ for a given pitch method and a list of frequencies, generate a sinewave | 
22 | and get mean deviation """ | 
23 | means = np.zeros(len(freqs)) | |
24 | medians = np.zeros(len(freqs)) | |
25 | for freq, fn in zip(freqs, range(len(freqs))): | |
26 | s = sinewave(freq, .50).reshape(-1, hop_size) | |
27 | #s = (sinewave(freq, .50) + .0*sinewave(freq/2., .50)).reshape(-1, hop_size) | |
28 | p = pitch(method, buf_size, hop_size, samplerate = samplerate) | |
29 | candidates = np.zeros(len(s)) | |
30 | #samples = np.zeros(buf_size) | |
31 | for frame, i in zip(s, range(len(s))): | |
32 | candidates[i] = p(frame)[0] | |
33 | # skip first few candidates | |
34 | candidates = candidates[4:] | |
35 | means[fn] = np.mean(candidates[candidates != 0] - freq) | |
36 | medians[fn] = np.median(candidates[candidates != 0] - freq) | |
37 | print (freq, means[fn], medians[fn]) | |
38 | return means, medians | |
39 | ||
40 | if __name__ == '__main__': | |
41 | freqs = np.arange(minfreq, maxfreq, 1.) | |
42 | modes = ["yin", "yinfft"] | |
43 | for mode in modes: | |
44 | means, medians = get_stats_for_pitch_method(mode, freqs) | |
45 | plt.figure() | |
46 | plt.plot(freqs, means, 'g-') | |
47 | plt.plot(freqs, medians, 'r--') | |
48 | #plt.savefig(mode + '_deviations_test.png', dpi=300) | |
49 | plt.show() |
45 | 45 | #define AUBIO_NPY_SMPL_CHR "f" |
46 | 46 | #endif |
47 | 47 | |
48 | #ifndef PATH_MAX | |
49 | #ifdef MAX_PATH | |
50 | #define PATH_MAX MAX_PATH | |
51 | #else | |
52 | #define PATH_MAX 1024 | |
53 | #endif | |
54 | #endif | |
55 | ||
48 | 56 | // compat with Python < 2.6 |
49 | 57 | #ifndef Py_TYPE |
50 | 58 | #define Py_TYPE(ob) (((PyObject*)(ob))->ob_type) |
79 | 79 | return NULL; |
80 | 80 | } |
81 | 81 | |
82 | self->uri = "none"; | |
82 | self->uri = NULL; | |
83 | 83 | if (uri != NULL) { |
84 | self->uri = uri; | |
84 | self->uri = (char_t *)malloc(sizeof(char_t) * (strnlen(uri, PATH_MAX) + 1)); | |
85 | strncpy(self->uri, uri, strnlen(uri, PATH_MAX) + 1); | |
85 | 86 | } |
86 | 87 | |
87 | 88 | self->samplerate = Py_aubio_default_samplerate; |
125 | 126 | { |
126 | 127 | del_aubio_sink(self->o); |
127 | 128 | free(self->mwrite_data.data); |
129 | if (self->uri) { | |
130 | free(self->uri); | |
131 | } | |
128 | 132 | Py_TYPE(self)->tp_free((PyObject *) self); |
129 | 133 | } |
130 | 134 | |
198 | 202 | Py_RETURN_NONE; |
199 | 203 | } |
200 | 204 | |
205 | static char Pyaubio_sink_enter_doc[] = ""; | |
206 | static PyObject* Pyaubio_sink_enter(Py_sink *self, PyObject *unused) { | |
207 | Py_INCREF(self); | |
208 | return (PyObject*)self; | |
209 | } | |
210 | ||
211 | static char Pyaubio_sink_exit_doc[] = ""; | |
212 | static PyObject* Pyaubio_sink_exit(Py_sink *self, PyObject *unused) { | |
213 | return Pyaubio_sink_close(self, unused); | |
214 | } | |
215 | ||
201 | 216 | static PyMethodDef Py_sink_methods[] = { |
202 | 217 | {"do", (PyCFunction) Py_sink_do, METH_VARARGS, Py_sink_do_doc}, |
203 | 218 | {"do_multi", (PyCFunction) Py_sink_do_multi, METH_VARARGS, Py_sink_do_multi_doc}, |
204 | 219 | {"close", (PyCFunction) Pyaubio_sink_close, METH_NOARGS, Py_sink_close_doc}, |
220 | {"__enter__", (PyCFunction)Pyaubio_sink_enter, METH_NOARGS, | |
221 | Pyaubio_sink_enter_doc}, | |
222 | {"__exit__", (PyCFunction)Pyaubio_sink_exit, METH_VARARGS, | |
223 | Pyaubio_sink_exit_doc}, | |
205 | 224 | {NULL} /* sentinel */ |
206 | 225 | }; |
207 | 226 |
99 | 99 | return NULL; |
100 | 100 | } |
101 | 101 | |
102 | self->uri = "none"; | |
102 | self->uri = NULL; | |
103 | 103 | if (uri != NULL) { |
104 | self->uri = uri; | |
104 | self->uri = (char_t *)malloc(sizeof(char_t) * (strnlen(uri, PATH_MAX) + 1)); | |
105 | strncpy(self->uri, uri, strnlen(uri, PATH_MAX) + 1); | |
105 | 106 | } |
106 | 107 | |
107 | 108 | self->samplerate = 0; |
161 | 162 | if (self->o) { |
162 | 163 | del_aubio_source(self->o); |
163 | 164 | free(self->c_mread_to.data); |
165 | } | |
166 | if (self->uri) { | |
167 | free(self->uri); | |
164 | 168 | } |
165 | 169 | Py_XDECREF(self->read_to); |
166 | 170 | Py_XDECREF(self->mread_to); |
241 | 245 | static PyObject * |
242 | 246 | Pyaubio_source_close (Py_source *self, PyObject *unused) |
243 | 247 | { |
244 | aubio_source_close (self->o); | |
248 | if (aubio_source_close(self->o) != 0) return NULL; | |
245 | 249 | Py_RETURN_NONE; |
246 | 250 | } |
247 | 251 | |
269 | 273 | return NULL; |
270 | 274 | } |
271 | 275 | Py_RETURN_NONE; |
276 | } | |
277 | ||
278 | static char Pyaubio_source_enter_doc[] = ""; | |
279 | static PyObject* Pyaubio_source_enter(Py_source *self, PyObject *unused) { | |
280 | Py_INCREF(self); | |
281 | return (PyObject*)self; | |
282 | } | |
283 | ||
284 | static char Pyaubio_source_exit_doc[] = ""; | |
285 | static PyObject* Pyaubio_source_exit(Py_source *self, PyObject *unused) { | |
286 | return Pyaubio_source_close(self, unused); | |
287 | } | |
288 | ||
289 | static PyObject* Pyaubio_source_iter(PyObject *self) { | |
290 | Py_INCREF(self); | |
291 | return (PyObject*)self; | |
292 | } | |
293 | ||
294 | static PyObject* Pyaubio_source_iter_next(Py_source *self) { | |
295 | PyObject *done, *size; | |
296 | if (self->channels == 1) { | |
297 | done = Py_source_do(self, NULL); | |
298 | } else { | |
299 | done = Py_source_do_multi(self, NULL); | |
300 | } | |
301 | if (!PyTuple_Check(done)) { | |
302 | PyErr_Format(PyExc_ValueError, | |
303 | "error when reading source: not opened?"); | |
304 | return NULL; | |
305 | } | |
306 | size = PyTuple_GetItem(done, 1); | |
307 | if (size != NULL && PyLong_Check(size)) { | |
308 | if (PyLong_AsLong(size) == (long)self->hop_size) { | |
309 | PyObject *vec = PyTuple_GetItem(done, 0); | |
310 | return vec; | |
311 | } else if (PyLong_AsLong(size) > 0) { | |
312 | // short read, return a shorter array | |
313 | PyArrayObject *shortread = (PyArrayObject*)PyTuple_GetItem(done, 0); | |
314 | PyArray_Dims newdims; | |
315 | PyObject *reshaped; | |
316 | newdims.len = PyArray_NDIM(shortread); | |
317 | newdims.ptr = PyArray_DIMS(shortread); | |
318 | // mono or multiple channels? | |
319 | if (newdims.len == 1) { | |
320 | newdims.ptr[0] = PyLong_AsLong(size); | |
321 | } else { | |
322 | newdims.ptr[1] = PyLong_AsLong(size); | |
323 | } | |
324 | reshaped = PyArray_Newshape(shortread, &newdims, NPY_CORDER); | |
325 | Py_DECREF(shortread); | |
326 | return reshaped; | |
327 | } else { | |
328 | PyErr_SetNone(PyExc_StopIteration); | |
329 | return NULL; | |
330 | } | |
331 | } else { | |
332 | PyErr_SetNone(PyExc_StopIteration); | |
333 | return NULL; | |
334 | } | |
272 | 335 | } |
273 | 336 | |
274 | 337 | static PyMethodDef Py_source_methods[] = { |
284 | 347 | METH_NOARGS, Py_source_close_doc}, |
285 | 348 | {"seek", (PyCFunction) Pyaubio_source_seek, |
286 | 349 | METH_VARARGS, Py_source_seek_doc}, |
350 | {"__enter__", (PyCFunction)Pyaubio_source_enter, METH_NOARGS, | |
351 | Pyaubio_source_enter_doc}, | |
352 | {"__exit__", (PyCFunction)Pyaubio_source_exit, METH_VARARGS, | |
353 | Pyaubio_source_exit_doc}, | |
287 | 354 | {NULL} /* sentinel */ |
288 | 355 | }; |
289 | 356 | |
313 | 380 | 0, |
314 | 381 | 0, |
315 | 382 | 0, |
316 | 0, | |
317 | 0, | |
383 | Pyaubio_source_iter, | |
384 | (unaryfunc)Pyaubio_source_iter_next, | |
318 | 385 | Py_source_methods, |
319 | 386 | Py_source_members, |
320 | 387 | 0, |
0 | #! /usr/bin/env python | |
1 | # -*- coding: utf-8 -*- | |
2 | ||
3 | """aubio command line tool | |
4 | ||
5 | This file was written by Paul Brossier <piem@aubio.org> and is released under | |
6 | the GNU/GPL v3. | |
7 | ||
8 | Note: this script is mostly about parsing command line arguments. For more | |
9 | readable code examples, check out the `python/demos` folder.""" | |
10 | ||
11 | import sys | |
12 | import argparse | |
13 | import aubio | |
14 | ||
15 | def aubio_parser(): | |
16 | epilog = 'use "%(prog)s <command> --help" for more info about each command' | |
17 | parser = argparse.ArgumentParser(epilog=epilog) | |
18 | parser.add_argument('-V', '--version', help="show version", | |
19 | action="store_true", dest="show_version") | |
20 | ||
21 | subparsers = parser.add_subparsers(title='commands', dest='command', | |
22 | metavar="") | |
23 | ||
24 | # onset subcommand | |
25 | subparser = subparsers.add_parser('onset', | |
26 | help='estimate time of onsets (beginning of sound event)', | |
27 | formatter_class = argparse.ArgumentDefaultsHelpFormatter) | |
28 | parser_add_input(subparser) | |
29 | parser_add_buf_hop_size(subparser) | |
30 | helpstr = "onset novelty function" | |
31 | helpstr += " <default|energy|hfc|complex|phase|specdiff|kl|mkl|specflux>" | |
32 | parser_add_method(subparser, helpstr=helpstr) | |
33 | parser_add_threshold(subparser) | |
34 | parser_add_silence(subparser) | |
35 | parser_add_minioi(subparser) | |
36 | parser_add_time_format(subparser) | |
37 | parser_add_verbose_help(subparser) | |
38 | subparser.set_defaults(process=process_onset) | |
39 | ||
40 | # pitch subcommand | |
41 | subparser = subparsers.add_parser('pitch', | |
42 | help='estimate fundamental frequency (monophonic)') | |
43 | parser_add_input(subparser) | |
44 | parser_add_buf_hop_size(subparser, buf_size=2048) | |
45 | helpstr = "pitch detection method <default|yinfft|yin|mcomb|fcomb|schmitt>" | |
46 | parser_add_method(subparser, helpstr=helpstr) | |
47 | parser_add_threshold(subparser) | |
48 | parser_add_silence(subparser) | |
49 | parser_add_time_format(subparser) | |
50 | parser_add_verbose_help(subparser) | |
51 | subparser.set_defaults(process=process_pitch) | |
52 | ||
53 | # beat subcommand | |
54 | subparser = subparsers.add_parser('beat', | |
55 | help='estimate location of beats') | |
56 | parser_add_input(subparser) | |
57 | parser_add_buf_hop_size(subparser, buf_size=1024, hop_size=512) | |
58 | parser_add_time_format(subparser) | |
59 | parser_add_verbose_help(subparser) | |
60 | subparser.set_defaults(process=process_beat) | |
61 | ||
62 | # tempo subcommand | |
63 | subparser = subparsers.add_parser('tempo', | |
64 | help='estimate overall tempo in bpm') | |
65 | parser_add_input(subparser) | |
66 | parser_add_buf_hop_size(subparser, buf_size=1024, hop_size=512) | |
67 | parser_add_time_format(subparser) | |
68 | parser_add_verbose_help(subparser) | |
69 | subparser.set_defaults(process=process_tempo) | |
70 | ||
71 | # notes subcommand | |
72 | subparser = subparsers.add_parser('notes', | |
73 | help='estimate midi-like notes (monophonic)') | |
74 | parser_add_input(subparser) | |
75 | parser_add_buf_hop_size(subparser) | |
76 | parser_add_time_format(subparser) | |
77 | parser_add_verbose_help(subparser) | |
78 | subparser.set_defaults(process=process_notes) | |
79 | ||
80 | # mfcc subcommand | |
81 | subparser = subparsers.add_parser('mfcc', | |
82 | help='extract Mel-Frequency Cepstrum Coefficients') | |
83 | parser_add_input(subparser) | |
84 | parser_add_buf_hop_size(subparser) | |
85 | parser_add_time_format(subparser) | |
86 | parser_add_verbose_help(subparser) | |
87 | subparser.set_defaults(process=process_mfcc) | |
88 | ||
89 | # melbands subcommand | |
90 | subparser = subparsers.add_parser('melbands', | |
91 | help='extract energies in Mel-frequency bands') | |
92 | parser_add_input(subparser) | |
93 | parser_add_buf_hop_size(subparser) | |
94 | parser_add_time_format(subparser) | |
95 | parser_add_verbose_help(subparser) | |
96 | subparser.set_defaults(process=process_melbands) | |
97 | ||
98 | return parser | |
99 | ||
100 | def parser_add_input(parser): | |
101 | parser.add_argument("source_uri", default=None, nargs='?', | |
102 | help="input sound file to analyse", metavar = "<source_uri>") | |
103 | parser.add_argument("-i", "--input", dest = "source_uri2", | |
104 | help="input sound file to analyse", metavar = "<source_uri>") | |
105 | parser.add_argument("-r", "--samplerate", | |
106 | metavar = "<freq>", type=int, | |
107 | action="store", dest="samplerate", default=0, | |
108 | help="samplerate at which the file should be represented") | |
109 | ||
110 | def parser_add_verbose_help(parser): | |
111 | parser.add_argument("-v","--verbose", | |
112 | action="count", dest="verbose", default=1, | |
113 | help="make lots of noise [default]") | |
114 | parser.add_argument("-q","--quiet", | |
115 | action="store_const", dest="verbose", const=0, | |
116 | help="be quiet") | |
117 | ||
118 | def parser_add_buf_hop_size(parser, buf_size=512, hop_size=256): | |
119 | parser.add_argument("-B","--bufsize", | |
120 | action="store", dest="buf_size", default=buf_size, | |
121 | metavar = "<size>", type=int, | |
122 | help="buffer size [default=%d]" % buf_size) | |
123 | parser.add_argument("-H","--hopsize", | |
124 | metavar = "<size>", type=int, | |
125 | action="store", dest="hop_size", default=hop_size, | |
126 | help="overlap size [default=%d]" % hop_size) | |
127 | ||
128 | def parser_add_method(parser, method='default', helpstr='method'): | |
129 | parser.add_argument("-m","--method", | |
130 | metavar = "<method>", type=str, | |
131 | action="store", dest="method", default=method, | |
132 | help="%s [default=%s]" % (helpstr, method)) | |
133 | ||
134 | def parser_add_threshold(parser, default=None): | |
135 | parser.add_argument("-t","--threshold", | |
136 | metavar = "<threshold>", type=float, | |
137 | action="store", dest="threshold", default=default, | |
138 | help="threshold [default=%s]" % default) | |
139 | ||
140 | def parser_add_silence(parser): | |
141 | parser.add_argument("-s", "--silence", | |
142 | metavar = "<value>", type=float, | |
143 | action="store", dest="silence", default=-70, | |
144 | help="silence threshold") | |
145 | ||
146 | def parser_add_minioi(parser): | |
147 | parser.add_argument("-M", "--minioi", | |
148 | metavar = "<value>", type=str, | |
149 | action="store", dest="minioi", default="12ms", | |
150 | help="minimum Inter-Onset Interval") | |
151 | ||
152 | def parser_add_time_format(parser): | |
153 | helpstr = "select time values output format (samples, ms, seconds)" | |
154 | helpstr += " [default=seconds]" | |
155 | parser.add_argument("-T", "--time-format", | |
156 | metavar='format', | |
157 | dest="time_format", | |
158 | default=None, | |
159 | help=helpstr) | |
160 | ||
161 | # some utilities | |
162 | ||
163 | def samples2seconds(n_frames, samplerate): | |
164 | return "%f\t" % (n_frames / float(samplerate)) | |
165 | ||
166 | def samples2milliseconds(n_frames, samplerate): | |
167 | return "%f\t" % (1000. * n_frames / float(samplerate)) | |
168 | ||
169 | def samples2samples(n_frames, samplerate): | |
170 | return "%d\t" % n_frames | |
171 | ||
172 | def timefunc(mode): | |
173 | if mode is None or mode == 'seconds' or mode == 's': | |
174 | return samples2seconds | |
175 | elif mode == 'ms' or mode == 'milliseconds': | |
176 | return samples2milliseconds | |
177 | elif mode == 'samples': | |
178 | return samples2samples | |
179 | else: | |
180 | raise ValueError('invalid time format %s' % mode) | |
181 | ||
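The three formatters and the `timefunc` dispatcher above are self-contained; a minimal standalone sketch reproducing their behavior (the sample counts below are made-up illustration values, not from the aubio test suite):

```python
# Sketch of the time formatters used by the aubio command line tool:
# each returns a tab-terminated string for one timestamp.
def samples2seconds(n_frames, samplerate):
    return "%f\t" % (n_frames / float(samplerate))

def samples2milliseconds(n_frames, samplerate):
    return "%f\t" % (1000. * n_frames / float(samplerate))

def samples2samples(n_frames, samplerate):
    return "%d\t" % n_frames

def timefunc(mode):
    # the default (None) and 's'/'seconds' both map to seconds
    if mode in (None, 'seconds', 's'):
        return samples2seconds
    elif mode in ('ms', 'milliseconds'):
        return samples2milliseconds
    elif mode == 'samples':
        return samples2samples
    raise ValueError('invalid time format %s' % mode)

# half a second of audio at 44.1 kHz, formatted in milliseconds
print(timefunc('ms')(22050, 44100))
```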
182 | # definition of processing classes | |
183 | ||
184 | class default_process(object): | |
185 | def __init__(self, args): | |
186 | if 'time_format' in args: | |
187 | self.time2string = timefunc(args.time_format) | |
188 | if args.verbose > 2 and hasattr(self, 'options'): | |
189 | name = type(self).__name__.split('_')[1] | |
190 | optstr = ' '.join(['running', name, 'with options', repr(self.options), '\n']) | |
191 | sys.stderr.write(optstr) | |
192 | def flush(self, n_frames, samplerate): | |
193 | # optionally called at the end of process | |
194 | pass | |
195 | ||
196 | def parse_options(self, args, valid_opts): | |
197 | # get any valid options found in a dictionary of arguments | |
198 | options = {k: v for k, v in vars(args).items() if k in valid_opts} | |
199 | self.options = options | |
200 | ||
201 | def remap_pvoc_options(self, options): | |
202 | # FIXME: we need to remap buf_size to win_s, hop_size to hop_s | |
203 | # adjust python/ext/py-phasevoc.c to understand buf_size/hop_size | |
204 | if 'buf_size' in options: | |
205 | options['win_s'] = options['buf_size'] | |
206 | del options['buf_size'] | |
207 | if 'hop_size' in options: | |
208 | options['hop_s'] = options['hop_size'] | |
209 | del options['hop_size'] | |
210 | self.options = options | |
211 | ||
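The remapping above is a plain key rename on the options dictionary; a small standalone sketch of the same idea (`remap_pvoc_options` here is a free function for illustration, not the method itself):

```python
# Sketch of remapping command-line option names to phase vocoder
# parameter names: buf_size -> win_s, hop_size -> hop_s.
def remap_pvoc_options(options):
    remapped = dict(options)  # leave the caller's dict untouched
    for old, new in (('buf_size', 'win_s'), ('hop_size', 'hop_s')):
        if old in remapped:
            remapped[new] = remapped.pop(old)
    return remapped

print(remap_pvoc_options({'buf_size': 512, 'hop_size': 256}))
```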
212 | class process_onset(default_process): | |
213 | valid_opts = ['method', 'hop_size', 'buf_size', 'samplerate'] | |
214 | def __init__(self, args): | |
215 | self.parse_options(args, self.valid_opts) | |
216 | self.onset = aubio.onset(**self.options) | |
217 | if args.threshold is not None: | |
218 | self.onset.set_threshold(args.threshold) | |
219 | if args.minioi: | |
220 | if args.minioi.endswith('ms'): | |
221 | self.onset.set_minioi_ms(float(args.minioi[:-2])) | |
222 | elif args.minioi.endswith('s'): | |
223 | self.onset.set_minioi_s(float(args.minioi[:-1])) | |
224 | else: | |
225 | self.onset.set_minioi(int(args.minioi)) | |
226 | if args.silence: | |
227 | self.onset.set_silence(args.silence) | |
228 | super(process_onset, self).__init__(args) | |
229 | def __call__(self, block): | |
230 | return self.onset(block) | |
231 | def repr_res(self, res, frames_read, samplerate): | |
232 | if res[0] != 0: | |
233 | outstr = self.time2string(self.onset.get_last(), samplerate) | |
234 | sys.stdout.write(outstr + '\n') | |
235 | ||
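The minimum inter-onset interval accepted by `process_onset` is suffix-driven: `'12ms'` means milliseconds, `'0.2s'` means seconds, and a bare number means samples. A sketch of that parsing as a standalone helper (`parse_minioi` is a hypothetical name, not part of the aubio API):

```python
# Sketch of the minioi suffix parsing used by process_onset.
def parse_minioi(minioi):
    if minioi.endswith('ms'):
        return float(minioi[:-2]), 'ms'
    elif minioi.endswith('s'):
        return float(minioi[:-1]), 's'
    return int(minioi), 'samples'

print(parse_minioi('12ms'))
```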
236 | class process_pitch(default_process): | |
237 | valid_opts = ['method', 'hop_size', 'buf_size', 'samplerate'] | |
238 | def __init__(self, args): | |
239 | self.parse_options(args, self.valid_opts) | |
240 | self.pitch = aubio.pitch(**self.options) | |
241 | if args.threshold is not None: | |
242 | self.pitch.set_tolerance(args.threshold) | |
243 | if args.silence is not None: | |
244 | self.pitch.set_silence(args.silence) | |
245 | super(process_pitch, self).__init__(args) | |
246 | def __call__(self, block): | |
247 | return self.pitch(block) | |
248 | def repr_res(self, res, frames_read, samplerate): | |
249 | fmt_out = self.time2string(frames_read, samplerate) | |
250 | sys.stdout.write(fmt_out + "%.6f\n" % res[0]) | |
251 | ||
252 | class process_beat(default_process): | |
253 | valid_opts = ['method', 'hop_size', 'buf_size', 'samplerate'] | |
254 | def __init__(self, args): | |
255 | self.parse_options(args, self.valid_opts) | |
256 | self.tempo = aubio.tempo(**self.options) | |
257 | super(process_beat, self).__init__(args) | |
258 | def __call__(self, block): | |
259 | return self.tempo(block) | |
260 | def repr_res(self, res, frames_read, samplerate): | |
261 | if res[0] != 0: | |
262 | outstr = self.time2string(self.tempo.get_last(), samplerate) | |
263 | sys.stdout.write(outstr + '\n') | |
264 | ||
265 | class process_tempo(process_beat): | |
266 | def __init__(self, args): | |
267 | super(process_tempo, self).__init__(args) | |
268 | self.beat_locations = [] | |
269 | def repr_res(self, res, frames_read, samplerate): | |
270 | if res[0] != 0: | |
271 | self.beat_locations.append(self.tempo.get_last_s()) | |
272 | def flush(self, frames_read, samplerate): | |
273 | import numpy as np | |
274 | if len(self.beat_locations) < 2: | |
275 | outstr = "unknown bpm" | |
276 | else: | |
277 | bpms = 60. / np.diff(self.beat_locations) | |
278 | mean_bpm = np.mean(bpms) | |
279 | if len(self.beat_locations) < 10: | |
280 | outstr = "%.2f bpm (uncertain)" % mean_bpm | |
281 | else: | |
282 | outstr = "%.2f bpm" % mean_bpm | |
283 | sys.stdout.write(outstr + '\n') | |
284 | ||
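`process_tempo` turns beat timestamps into a tempo estimate: 60 divided by each inter-beat interval, averaged over all intervals. A pure-Python sketch of that computation, with made-up beat times for illustration:

```python
# Sketch of deriving bpm from beat timestamps, as process_tempo does
# (here without numpy: manual diff and mean over the intervals).
def estimate_bpm(beat_locations):
    if len(beat_locations) < 2:
        return None  # not enough beats for an estimate
    intervals = [b - a for a, b in zip(beat_locations, beat_locations[1:])]
    bpms = [60. / i for i in intervals]
    return sum(bpms) / len(bpms)

beats = [0.5, 1.0, 1.5, 2.0, 2.5]  # hypothetical beats, one every 0.5 s
print("%.2f bpm" % estimate_bpm(beats))
```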
285 | class process_notes(default_process): | |
286 | valid_opts = ['method', 'hop_size', 'buf_size', 'samplerate'] | |
287 | def __init__(self, args): | |
288 | self.parse_options(args, self.valid_opts) | |
289 | self.notes = aubio.notes(**self.options) | |
290 | super(process_notes, self).__init__(args) | |
291 | def __call__(self, block): | |
292 | return self.notes(block) | |
293 | def repr_res(self, res, frames_read, samplerate): | |
294 | if res[2] != 0: # note off | |
295 | fmt_out = self.time2string(frames_read, samplerate) | |
296 | sys.stdout.write(fmt_out + '\n') | |
297 | if res[0] != 0: # note on | |
298 | lastmidi = res[0] | |
299 | fmt_out = "%f\t" % lastmidi | |
300 | fmt_out += self.time2string(frames_read, samplerate) | |
301 | sys.stdout.write(fmt_out) # + '\t') | |
302 | def flush(self, frames_read, samplerate): | |
303 | eof = self.time2string(frames_read, samplerate) | |
304 | sys.stdout.write(eof + '\n') | |
305 | ||
306 | class process_mfcc(default_process): | |
307 | def __init__(self, args): | |
308 | valid_opts1 = ['hop_size', 'buf_size'] | |
309 | self.parse_options(args, valid_opts1) | |
310 | self.remap_pvoc_options(self.options) | |
311 | self.pv = aubio.pvoc(**self.options) | |
312 | ||
313 | valid_opts2 = ['buf_size', 'n_filters', 'n_coeffs', 'samplerate'] | |
314 | self.parse_options(args, valid_opts2) | |
315 | self.mfcc = aubio.mfcc(**self.options) | |
316 | ||
317 | # remember all options | |
318 | self.parse_options(args, list(set(valid_opts1 + valid_opts2))) | |
319 | ||
320 | super(process_mfcc, self).__init__(args) | |
321 | ||
322 | def __call__(self, block): | |
323 | fftgrain = self.pv(block) | |
324 | return self.mfcc(fftgrain) | |
325 | def repr_res(self, res, frames_read, samplerate): | |
326 | fmt_out = self.time2string(frames_read, samplerate) | |
327 | fmt_out += ' '.join(["% 9.7f" % f for f in res.tolist()]) | |
328 | sys.stdout.write(fmt_out + '\n') | |
329 | ||
330 | class process_melbands(default_process): | |
331 | def __init__(self, args): | |
332 | self.args = args | |
333 | valid_opts = ['hop_size', 'buf_size'] | |
334 | self.parse_options(args, valid_opts) | |
335 | self.remap_pvoc_options(self.options) | |
336 | self.pv = aubio.pvoc(**self.options) | |
337 | ||
338 | valid_opts = ['buf_size', 'n_filters'] | |
339 | self.parse_options(args, valid_opts) | |
340 | self.remap_pvoc_options(self.options) | |
341 | self.filterbank = aubio.filterbank(**self.options) | |
342 | self.filterbank.set_mel_coeffs_slaney(args.samplerate) | |
343 | ||
344 | super(process_melbands, self).__init__(args) | |
345 | def __call__(self, block): | |
346 | fftgrain = self.pv(block) | |
347 | return self.filterbank(fftgrain) | |
348 | def repr_res(self, res, frames_read, samplerate): | |
349 | fmt_out = self.time2string(frames_read, samplerate) | |
350 | fmt_out += ' '.join(["% 9.7f" % f for f in res.tolist()]) | |
351 | sys.stdout.write(fmt_out + '\n') | |
352 | ||
353 | def main(): | |
354 | parser = aubio_parser() | |
355 | args = parser.parse_args() | |
356 | if 'show_version' in args and args.show_version: | |
357 | sys.stdout.write('aubio version ' + aubio.version + '\n') | |
358 | sys.exit(0) | |
359 | elif 'verbose' in args and args.verbose > 3: | |
360 | sys.stderr.write('aubio version ' + aubio.version + '\n') | |
361 | if 'command' not in args or args.command is None: | |
362 | # no command given, print help and return 1 | |
363 | parser.print_help() | |
364 | sys.exit(1) | |
365 | elif not args.source_uri and not args.source_uri2: | |
366 | sys.stderr.write("Error: a source is required\n") | |
367 | parser.print_help() | |
368 | sys.exit(1) | |
369 | elif args.source_uri2 is not None: | |
370 | args.source_uri = args.source_uri2 | |
371 | try: | |
372 | # open source_uri | |
373 | with aubio.source(args.source_uri, hop_size=args.hop_size, | |
374 | samplerate=args.samplerate) as a_source: | |
375 | # always update args.samplerate to native samplerate, in case | |
376 | # source was opened with args.samplerate=0 | |
377 | args.samplerate = a_source.samplerate | |
378 | # create the processor for this subcommand | |
379 | processor = args.process(args) | |
380 | frames_read = 0 | |
381 | while True: | |
382 | # read new block from source | |
383 | block, read = a_source() | |
384 | # execute processor on this block | |
385 | res = processor(block) | |
386 | # print results for this block | |
387 | if args.verbose > 0: | |
388 | processor.repr_res(res, frames_read, a_source.samplerate) | |
389 | # increment total number of frames read | |
390 | frames_read += read | |
391 | # exit loop at end of file | |
392 | if read < a_source.hop_size: break | |
393 | # flush the processor if needed | |
394 | processor.flush(frames_read, a_source.samplerate) | |
395 | if args.verbose > 1: | |
396 | fmt_string = "read {:.2f}s" | |
397 | fmt_string += " ({:d} samples in {:d} blocks of {:d})" | |
398 | fmt_string += " from {:s} at {:d}Hz\n" | |
399 | sys.stderr.write(fmt_string.format( | |
400 | frames_read/float(a_source.samplerate), | |
401 | frames_read, | |
402 | frames_read // a_source.hop_size + 1, | |
403 | a_source.hop_size, | |
404 | a_source.uri, | |
405 | a_source.samplerate)) | |
406 | except KeyboardInterrupt: | |
407 | sys.exit(1) |
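The loop in `main()` reads hop-sized blocks until a short read signals end of file. A sketch of that pattern with a stub standing in for `aubio.source` (the `FakeSource` class is an illustration, not part of aubio):

```python
# Sketch of the block-reading loop in main(): read hop_size frames at a
# time, accumulate the count, stop when a block comes back short.
class FakeSource:
    def __init__(self, total_frames, hop_size):
        self.remaining = total_frames
        self.hop_size = hop_size
    def __call__(self):
        # mimic aubio.source.__call__: return (block, frames_read)
        read = min(self.hop_size, self.remaining)
        self.remaining -= read
        return [0.] * read, read

a_source = FakeSource(total_frames=1000, hop_size=256)
frames_read = 0
while True:
    block, read = a_source()
    frames_read += read
    if read < a_source.hop_size: break
print(frames_read)
```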
0 | #! /usr/bin/env python | |
1 | ||
2 | """ this file was written by Paul Brossier | |
3 | it is released under the GNU/GPL license. | |
4 | """ | |
5 | ||
6 | import sys | |
7 | ||
8 | def parse_args(): | |
9 | from optparse import OptionParser | |
10 | usage = "usage: %s [options] -i soundfile" % sys.argv[0] | |
11 | usage += "\n help: %s -h" % sys.argv[0] | |
12 | parser = OptionParser(usage=usage) | |
13 | parser.add_option("-i", "--input", action = "store", dest = "source_file", | |
14 | help="input sound file to analyse", metavar = "<source_file>") | |
15 | parser.add_option("-O","--onset-method", | |
16 | action="store", dest="onset_method", default='default', | |
17 | metavar = "<onset_method>", | |
18 | help="onset detection method [default=default] \ | |
19 | complexdomain|hfc|phase|specdiff|energy|kl|mkl") | |
20 | # cutting methods | |
21 | parser.add_option("-b","--beat", | |
22 | action="store_true", dest="beat", default=False, | |
23 | help="use beat locations") | |
24 | """ | |
25 | parser.add_option("-S","--silencecut", | |
26 | action="store_true", dest="silencecut", default=False, | |
27 | help="use silence locations") | |
28 | parser.add_option("-s","--silence", | |
29 | metavar = "<value>", | |
30 | action="store", dest="silence", default=-70, | |
31 | help="silence threshold [default=-70]") | |
32 | """ | |
33 | # algorithm parameters | |
34 | parser.add_option("-r", "--samplerate", | |
35 | metavar = "<freq>", type='int', | |
36 | action="store", dest="samplerate", default=0, | |
37 | help="samplerate at which the file should be represented") | |
38 | parser.add_option("-B","--bufsize", | |
39 | action="store", dest="bufsize", default=512, | |
40 | metavar = "<size>", type='int', | |
41 | help="buffer size [default=512]") | |
42 | parser.add_option("-H","--hopsize", | |
43 | metavar = "<size>", type='int', | |
44 | action="store", dest="hopsize", default=256, | |
45 | help="overlap size [default=256]") | |
46 | parser.add_option("-t","--onset-threshold", | |
47 | metavar = "<value>", type="float", | |
48 | action="store", dest="threshold", default=0.3, | |
49 | help="onset peak picking threshold [default=0.3]") | |
50 | parser.add_option("-c","--cut", | |
51 | action="store_true", dest="cut", default=False, | |
52 | help="cut input sound file at detected labels \ | |
53 | best used with option -L") | |
54 | ||
55 | # minioi | |
56 | parser.add_option("-M","--minioi", | |
57 | metavar = "<value>", type='string', | |
58 | action="store", dest="minioi", default="12ms", | |
59 | help="minimum inter onset interval [default=12ms]") | |
60 | ||
61 | """ | |
62 | parser.add_option("-D","--delay", | |
63 | action = "store", dest = "delay", type = "float", | |
64 | metavar = "<seconds>", default=0, | |
65 | help="number of seconds to take back [default=system]\ | |
66 | default system delay is 3*hopsize/samplerate") | |
67 | parser.add_option("-C","--dcthreshold", | |
68 | metavar = "<value>", | |
69 | action="store", dest="dcthreshold", default=1., | |
70 | help="onset peak picking DC component [default=1.]") | |
71 | parser.add_option("-L","--localmin", | |
72 | action="store_true", dest="localmin", default=False, | |
73 | help="use local minima after peak detection") | |
74 | parser.add_option("-d","--derivate", | |
75 | action="store_true", dest="derivate", default=False, | |
76 | help="derivate onset detection function") | |
77 | parser.add_option("-z","--zerocross", | |
78 | metavar = "<value>", | |
79 | action="store", dest="zerothres", default=0.008, | |
80 | help="zero-crossing threshold for slicing [default=0.008]") | |
81 | """ | |
82 | # plotting functions | |
83 | """ | |
84 | parser.add_option("-p","--plot", | |
85 | action="store_true", dest="plot", default=False, | |
86 | help="draw plot") | |
87 | parser.add_option("-x","--xsize", | |
88 | metavar = "<size>", | |
89 | action="store", dest="xsize", default=1., | |
90 | type='float', help="define xsize for plot") | |
91 | parser.add_option("-y","--ysize", | |
92 | metavar = "<size>", | |
93 | action="store", dest="ysize", default=1., | |
94 | type='float', help="define ysize for plot") | |
95 | parser.add_option("-f","--function", | |
96 | action="store_true", dest="func", default=False, | |
97 | help="print detection function") | |
98 | parser.add_option("-n","--no-onsets", | |
99 | action="store_true", dest="nplot", default=False, | |
100 | help="do not plot detected onsets") | |
101 | parser.add_option("-O","--outplot", | |
102 | metavar = "<output_image>", | |
103 | action="store", dest="outplot", default=None, | |
104 | help="save plot to output.{ps,png}") | |
105 | parser.add_option("-F","--spectrogram", | |
106 | action="store_true", dest="spectro", default=False, | |
107 | help="add spectrogram to the plot") | |
108 | """ | |
109 | parser.add_option("-o","--output", type = str, | |
110 | metavar = "<outputdir>", | |
111 | action="store", dest="output_directory", default=None, | |
112 | help="specify path where slices of the original file should be created") | |
113 | parser.add_option("--cut-until-nsamples", type = int, | |
114 | metavar = "<samples>", | |
115 | action = "store", dest = "cut_until_nsamples", default = None, | |
116 | help="how many extra samples should be added at the end of each slice") | |
117 | parser.add_option("--cut-every-nslices", type = int, | |
118 | metavar = "<samples>", | |
119 | action = "store", dest = "cut_every_nslices", default = None, | |
120 | help="how many slices should be grouped together at each cut") | |
121 | parser.add_option("--cut-until-nslices", type = int, | |
122 | metavar = "<slices>", | |
123 | action = "store", dest = "cut_until_nslices", default = None, | |
124 | help="how many extra slices should be added at the end of each slice") | |
125 | ||
126 | parser.add_option("-v","--verbose", | |
127 | action="store_true", dest="verbose", default=True, | |
128 | help="make lots of noise [default]") | |
129 | parser.add_option("-q","--quiet", | |
130 | action="store_false", dest="verbose", default=True, | |
131 | help="be quiet") | |
132 | (options, args) = parser.parse_args() | |
133 | if not options.source_file: | |
134 | if len(args) == 1: | |
135 | options.source_file = args[0] | |
136 | else: | |
137 | print ("no file name given\n" + usage) | |
138 | sys.exit(1) | |
139 | return options, args | |
140 | ||
141 | def main(): | |
142 | options, args = parse_args() | |
143 | ||
144 | hopsize = options.hopsize | |
145 | bufsize = options.bufsize | |
146 | samplerate = options.samplerate | |
147 | source_file = options.source_file | |
148 | ||
149 | from aubio import onset, tempo, source | |
150 | ||
151 | s = source(source_file, samplerate, hopsize) | |
152 | if samplerate == 0: samplerate = s.get_samplerate() | |
153 | ||
154 | if options.beat: | |
155 | o = tempo(options.onset_method, bufsize, hopsize) | |
156 | else: | |
157 | o = onset(options.onset_method, bufsize, hopsize) | |
158 | if options.minioi: | |
159 | if options.minioi.endswith('ms'): | |
160 | o.set_minioi_ms(float(options.minioi[:-2])) | |
161 | elif options.minioi.endswith('s'): | |
162 | o.set_minioi_s(float(options.minioi[:-1])) | |
163 | else: | |
164 | o.set_minioi(int(options.minioi)) | |
165 | o.set_threshold(options.threshold) | |
166 | ||
167 | timestamps = [] | |
168 | total_frames = 0 | |
169 | # analyze pass | |
170 | while True: | |
171 | samples, read = s() | |
172 | if o(samples): | |
173 | timestamps.append (o.get_last()) | |
174 | if options.verbose: print ("%.4f" % o.get_last_s()) | |
175 | total_frames += read | |
176 | if read < hopsize: break | |
177 | del s | |
178 | # print some info | |
179 | nstamps = len(timestamps) | |
180 | duration = float (total_frames) / float(samplerate) | |
181 | info = 'found %(nstamps)d timestamps in %(source_file)s' % locals() | |
182 | info += ' (total %(duration).2fs at %(samplerate)dHz)\n' % locals() | |
183 | sys.stderr.write(info) | |
184 | ||
185 | # cutting pass | |
186 | if options.cut and nstamps > 0: | |
187 | # generate output files | |
188 | from aubio.slicing import slice_source_at_stamps | |
189 | timestamps_end = None | |
190 | if options.cut_every_nslices: | |
191 | timestamps = timestamps[::options.cut_every_nslices] | |
192 | nstamps = len(timestamps) | |
193 | if options.cut_until_nslices and options.cut_until_nsamples: | |
194 | print ("warning: using cut_until_nslices, but cut_until_nsamples is set") | |
195 | if options.cut_until_nsamples: | |
196 | timestamps_end = [t + options.cut_until_nsamples for t in timestamps[1:]] | |
197 | timestamps_end += [ 1e120 ] | |
198 | if options.cut_until_nslices: | |
199 | timestamps_end = [t for t in timestamps[1 + options.cut_until_nslices:]] | |
200 | timestamps_end += [ 1e120 ] * (options.cut_until_nslices + 1) | |
201 | slice_source_at_stamps(source_file, timestamps, timestamps_end = timestamps_end, | |
202 | output_dir = options.output_directory, | |
203 | samplerate = samplerate) | |
204 | ||
205 | # print some info | |
206 | duration = float (total_frames) / float(samplerate) | |
207 | info = 'created %(nstamps)d slices from %(source_file)s' % locals() | |
208 | info += ' (total %(duration).2fs at %(samplerate)dHz)\n' % locals() | |
209 | sys.stderr.write(info) |
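The cutting pass above derives an end timestamp for each slice: every slice ends where the next one starts (plus the optional `--cut-until-nsamples` tail), and the last slice runs to a huge sentinel value so it extends to the end of the file. A standalone sketch of that construction (`end_stamps` is a hypothetical helper name; the timestamps are illustration values):

```python
# Sketch of building per-slice end timestamps from onset timestamps,
# as done before calling slice_source_at_stamps.
def end_stamps(timestamps, extra_samples=0):
    ends = [t + extra_samples for t in timestamps[1:]]
    ends += [1e120]  # sentinel: last slice extends to end of file
    return ends

print(end_stamps([0, 44100, 88200], extra_samples=100))
```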
182 | 182 | |
183 | 183 | def gen_code(self): |
184 | 184 | out = "" |
185 | out += self.gen_struct() | |
186 | out += self.gen_doc() | |
187 | out += self.gen_new() | |
188 | out += self.gen_init() | |
189 | out += self.gen_del() | |
190 | out += self.gen_do() | |
191 | out += self.gen_memberdef() | |
192 | out += self.gen_set() | |
193 | out += self.gen_get() | |
194 | out += self.gen_methodef() | |
195 | out += self.gen_typeobject() | |
185 | try: | |
186 | out += self.gen_struct() | |
187 | out += self.gen_doc() | |
188 | out += self.gen_new() | |
189 | out += self.gen_init() | |
190 | out += self.gen_del() | |
191 | out += self.gen_do() | |
192 | out += self.gen_memberdef() | |
193 | out += self.gen_set() | |
194 | out += self.gen_get() | |
195 | out += self.gen_methodef() | |
196 | out += self.gen_typeobject() | |
197 | except Exception as e: | |
198 | print ("Failed generating code for", self.shortname) | |
199 | raise | |
196 | 200 | return out |
197 | 201 | |
198 | 202 | def gen_struct(self): |
38 | 38 | 'source_wavread', |
39 | 39 | #'sampler', |
40 | 40 | 'audio_unit', |
41 | 'spectral_whitening', | |
41 | 42 | ] |
42 | 43 | |
43 | 44 | def get_preprocessor(): |
2 | 2 | import sys, os, glob, subprocess |
3 | 3 | import distutils, distutils.command.clean, distutils.dir_util |
4 | 4 | from .gen_external import generate_external, header, output_path |
5 | ||
6 | from this_version import get_aubio_version | |
5 | 7 | |
6 | 8 | # inspired from https://gist.github.com/abergmeier/9488990 |
7 | 9 | def add_packages(packages, ext=None, **kw): |
52 | 54 | ext.library_dirs += [os.path.join('build', 'src')] |
53 | 55 | ext.libraries += ['aubio'] |
54 | 56 | |
55 | def add_local_aubio_sources(ext, usedouble = False): | |
57 | def add_local_aubio_sources(ext): | |
56 | 58 | """ build aubio inside python module instead of linking against libaubio """ |
57 | 59 | print("Info: libaubio was not installed or built locally with waf, adding src/") |
58 | 60 | aubio_sources = sorted(glob.glob(os.path.join('src', '**.c'))) |
60 | 62 | ext.sources += aubio_sources |
61 | 63 | |
62 | 64 | def add_local_macros(ext, usedouble = False): |
65 | if usedouble: | |
66 | ext.define_macros += [('HAVE_AUBIO_DOUBLE', 1)] | |
63 | 67 | # define macros (waf puts them in build/src/config.h) |
64 | 68 | for define_macro in ['HAVE_STDLIB_H', 'HAVE_STDIO_H', |
65 | 69 | 'HAVE_MATH_H', 'HAVE_STRING_H', |
71 | 75 | def add_external_deps(ext, usedouble = False): |
72 | 76 | # look for additional packages |
73 | 77 | print("Info: looking for *optional* additional packages") |
74 | packages = ['libavcodec', 'libavformat', 'libavutil', 'libavresample', | |
75 | 'jack', | |
76 | 'jack', | |
78 | packages = ['libavcodec', 'libavformat', 'libavutil', | |
79 | 'libswresample', 'libavresample', | |
77 | 80 | 'sndfile', |
78 | 81 | #'fftw3f', |
79 | 82 | ] |
80 | 83 | # samplerate only works with float |
81 | if usedouble == False: | |
84 | if usedouble is False: | |
82 | 85 | packages += ['samplerate'] |
83 | 86 | else: |
84 | 87 | print("Info: not adding libsamplerate in double precision mode") |
85 | 88 | add_packages(packages, ext=ext) |
86 | 89 | if 'avcodec' in ext.libraries \ |
87 | 90 | and 'avformat' in ext.libraries \ |
88 | and 'avutil' in ext.libraries \ | |
89 | and 'avresample' in ext.libraries: | |
90 | ext.define_macros += [('HAVE_LIBAV', 1)] | |
91 | if 'jack' in ext.libraries: | |
92 | ext.define_macros += [('HAVE_JACK', 1)] | |
91 | and 'avutil' in ext.libraries: | |
92 | if 'swresample' in ext.libraries: | |
93 | ext.define_macros += [('HAVE_SWRESAMPLE', 1)] | |
94 | elif 'avresample' in ext.libraries: | |
95 | ext.define_macros += [('HAVE_AVRESAMPLE', 1)] | |
96 | if 'swresample' in ext.libraries or 'avresample' in ext.libraries: | |
97 | ext.define_macros += [('HAVE_LIBAV', 1)] | |
93 | 98 | if 'sndfile' in ext.libraries: |
94 | 99 | ext.define_macros += [('HAVE_SNDFILE', 1)] |
95 | 100 | if 'samplerate' in ext.libraries: |
110 | 115 | |
111 | 116 | ext.define_macros += [('HAVE_WAVWRITE', 1)] |
112 | 117 | ext.define_macros += [('HAVE_WAVREAD', 1)] |
113 | # TODO: | |
114 | # add cblas | |
118 | ||
119 | # TODO: add cblas | |
115 | 120 | if 0: |
116 | 121 | ext.libraries += ['cblas'] |
117 | 122 | ext.define_macros += [('HAVE_ATLAS_CBLAS_H', 1)] |
118 | 123 | |
119 | 124 | def add_system_aubio(ext): |
120 | 125 | # use pkg-config to find aubio's location |
121 | add_packages(['aubio'], ext) | |
126 | aubio_version = get_aubio_version() | |
127 | add_packages(['aubio = ' + aubio_version], ext) | |
122 | 128 | if 'aubio' not in ext.libraries: |
123 | print("Error: libaubio not found") | |
129 | print("Info: aubio " + aubio_version + " was not found by pkg-config") | |
130 | else: | |
131 | print("Info: using system aubio " + aubio_version + " found in " + ' '.join(ext.library_dirs)) | |
132 | ||
133 | def add_libav_on_win(ext): | |
134 | """ no pkg-config on windows, simply assume these libs are available """ | |
135 | ext.libraries += ['avformat', 'avutil', 'avcodec', 'swresample'] | |
136 | for define_macro in ['HAVE_LIBAV', 'HAVE_SWRESAMPLE']: | |
137 | ext.define_macros += [(define_macro, 1)] | |
124 | 138 | |
125 | 139 | class CleanGenerated(distutils.command.clean.clean): |
126 | 140 | def run(self): |
148 | 162 | |
149 | 163 | def build_extension(self, extension): |
150 | 164 | if self.enable_double or 'HAVE_AUBIO_DOUBLE' in os.environ: |
151 | extension.define_macros += [('HAVE_AUBIO_DOUBLE', 1)] | |
152 | 165 | enable_double = True |
153 | 166 | else: |
154 | 167 | enable_double = False |
159 | 172 | # use local src/aubio.h |
160 | 173 | if os.path.isfile(os.path.join('src', 'aubio.h')): |
161 | 174 | add_local_aubio_header(extension) |
162 | add_local_macros(extension) | |
175 | add_local_macros(extension, usedouble=enable_double) | |
163 | 176 | # look for a local waf build |
164 | 177 | if os.path.isfile(os.path.join('build','src', 'fvec.c.1.o')): |
165 | 178 | add_local_aubio_lib(extension) |
166 | 179 | else: |
167 | 180 | # check for external dependencies |
168 | 181 | add_external_deps(extension, usedouble=enable_double) |
182 | # force adding libav on windows | |
183 | if os.name == 'nt' and ('WITH_LIBAV' in os.environ \ | |
184 | or 'CONDA_PREFIX' in os.environ): | |
185 | add_libav_on_win(extension) | |
169 | 186 | # add libaubio sources and look for optional deps with pkg-config |
170 | add_local_aubio_sources(extension, usedouble=enable_double) | |
187 | add_local_aubio_sources(extension) | |
171 | 188 | # generate files python/gen/*.c, python/gen/aubio-generated.h |
189 | extension.include_dirs += [ output_path ] | |
172 | 190 | extension.sources += generate_external(header, output_path, overwrite = False, |
173 | 191 | usedouble=enable_double) |
174 | 192 | return _build_ext.build_extension(self, extension) |
0 | #! /usr/bin/env python | |
1 | ||
2 | """ this file was written by Paul Brossier | |
3 | it is released under the GNU/GPL license. | |
4 | """ | |
5 | ||
6 | import sys | |
7 | #from aubio.task import * | |
8 | ||
9 | usage = "usage: %s [options] -i soundfile" % sys.argv[0] | |
10 | usage += "\n help: %s -h" % sys.argv[0] | |
11 | ||
12 | def parse_args(): | |
13 | from optparse import OptionParser | |
14 | parser = OptionParser(usage=usage) | |
15 | parser.add_option("-i", "--input", action = "store", dest = "source_file", | |
16 | help="input sound file to analyse", metavar = "<source_file>") | |
17 | parser.add_option("-O","--onset-method", | |
18 | action="store", dest="onset_method", default='default', | |
19 | metavar = "<onset_method>", | |
20 | help="onset detection method [default=default] \ | |
21 | complexdomain|hfc|phase|specdiff|energy|kl|mkl") | |
22 | # cutting methods | |
23 | parser.add_option("-b","--beat", | |
24 | action="store_true", dest="beat", default=False, | |
25 | help="use beat locations") | |
26 | """ | |
27 | parser.add_option("-S","--silencecut", | |
28 | action="store_true", dest="silencecut", default=False, | |
29 | help="use silence locations") | |
30 | parser.add_option("-s","--silence", | |
31 | metavar = "<value>", | |
32 | action="store", dest="silence", default=-70, | |
33 | help="silence threshold [default=-70]") | |
34 | """ | |
35 | # algorithm parameters | |
36 | parser.add_option("-r", "--samplerate", | |
37 | metavar = "<freq>", type='int', | |
38 | action="store", dest="samplerate", default=0, | |
39 | help="samplerate at which the file should be represented") | |
40 | parser.add_option("-B","--bufsize", | |
41 | action="store", dest="bufsize", default=512, | |
42 | metavar = "<size>", type='int', | |
43 | help="buffer size [default=512]") | |
44 | parser.add_option("-H","--hopsize", | |
45 | metavar = "<size>", type='int', | |
46 | action="store", dest="hopsize", default=256, | |
47 | help="overlap size [default=256]") | |
48 | parser.add_option("-t","--onset-threshold", | |
49 | metavar = "<value>", type="float", | |
50 | action="store", dest="threshold", default=0.3, | |
51 | help="onset peak picking threshold [default=0.3]") | |
52 | parser.add_option("-c","--cut", | |
53 | action="store_true", dest="cut", default=False, | |
54 | help="cut input sound file at detected labels \ | |
55 | best used with option -L") | |
56 | ||
57 | # minioi | |
58 | parser.add_option("-M","--minioi", | |
59 | metavar = "<value>", type='string', | |
60 | action="store", dest="minioi", default="12ms", | |
61 | help="minimum inter onset interval [default=12ms]") | |
62 | ||
63 | """ | |
64 | parser.add_option("-D","--delay", | |
65 | action = "store", dest = "delay", type = "float", | |
66 | metavar = "<seconds>", default=0, | |
67 | help="number of seconds to take back [default=system]\ | |
68 | default system delay is 3*hopsize/samplerate") | |
69 | parser.add_option("-C","--dcthreshold", | |
70 | metavar = "<value>", | |
71 | action="store", dest="dcthreshold", default=1., | |
72 | help="onset peak picking DC component [default=1.]") | |
73 | parser.add_option("-L","--localmin", | |
74 | action="store_true", dest="localmin", default=False, | |
75 | help="use local minima after peak detection") | |
76 | parser.add_option("-d","--derivate", | |
77 | action="store_true", dest="derivate", default=False, | |
78 | help="derivate onset detection function") | |
79 | parser.add_option("-z","--zerocross", | |
80 | metavar = "<value>", | |
81 | action="store", dest="zerothres", default=0.008, | |
82 | help="zero-crossing threshold for slicing [default=0.008]") | |
83 | """ | |
84 | # plotting functions | |
85 | """ | |
86 | parser.add_option("-p","--plot", | |
87 | action="store_true", dest="plot", default=False, | |
88 | help="draw plot") | |
89 | parser.add_option("-x","--xsize", | |
90 | metavar = "<size>", | |
91 | action="store", dest="xsize", default=1., | |
92 | type='float', help="define xsize for plot") | |
93 | parser.add_option("-y","--ysize", | |
94 | metavar = "<size>", | |
95 | action="store", dest="ysize", default=1., | |
96 | type='float', help="define ysize for plot") | |
97 | parser.add_option("-f","--function", | |
98 | action="store_true", dest="func", default=False, | |
99 | help="print detection function") | |
100 | parser.add_option("-n","--no-onsets", | |
101 | action="store_true", dest="nplot", default=False, | |
102 | help="do not plot detected onsets") | |
103 | parser.add_option("-O","--outplot", | |
104 | metavar = "<output_image>", | |
105 | action="store", dest="outplot", default=None, | |
106 | help="save plot to output.{ps,png}") | |
107 | parser.add_option("-F","--spectrogram", | |
108 | action="store_true", dest="spectro", default=False, | |
109 | help="add spectrogram to the plot") | |
110 | """ | |
111 | parser.add_option("-o","--output", type = str, | |
112 | metavar = "<outputdir>", | |
113 | action="store", dest="output_directory", default=None, | |
114 | help="specify path where slices of the original file should be created") | |
115 | parser.add_option("--cut-until-nsamples", type = int, | |
116 | metavar = "<samples>", | |
117 | action = "store", dest = "cut_until_nsamples", default = None, | |
118 | help="how many extra samples should be added at the end of each slice") | |
119 | parser.add_option("--cut-until-nslices", type = int, | |
120 | metavar = "<slices>", | |
121 | action = "store", dest = "cut_until_nslices", default = None, | |
122 | help="how many extra slices should be added after each slice") | |
123 | ||
124 | parser.add_option("-v","--verbose", | |
125 | action="store_true", dest="verbose", default=True, | |
126 | help="make lots of noise [default]") | |
127 | parser.add_option("-q","--quiet", | |
128 | action="store_false", dest="verbose", default=True, | |
129 | help="be quiet") | |
130 | (options, args) = parser.parse_args() | |
131 | if not options.source_file: | |
132 | import os.path | |
133 | if len(args) == 1: | |
134 | options.source_file = args[0] | |
135 | else: | |
136 | print ("no file name given\n" + usage) | |
137 | sys.exit(1) | |
138 | return options, args | |
139 | ||
140 | if __name__ == '__main__': | |
141 | options, args = parse_args() | |
142 | ||
143 | hopsize = options.hopsize | |
144 | bufsize = options.bufsize | |
145 | samplerate = options.samplerate | |
146 | source_file = options.source_file | |
147 | ||
148 | from aubio import onset, tempo, source, sink | |
149 | ||
150 | s = source(source_file, samplerate, hopsize) | |
151 | if samplerate == 0: samplerate = s.get_samplerate() | |
152 | ||
153 | if options.beat: | |
154 | o = tempo(options.onset_method, bufsize, hopsize) | |
155 | else: | |
156 | o = onset(options.onset_method, bufsize, hopsize) | |
157 | if options.minioi: | |
158 | if options.minioi.endswith('ms'): | |
159 | o.set_minioi_ms(float(options.minioi[:-2])) | |
160 | elif options.minioi.endswith('s'): | |
161 | o.set_minioi_s(float(options.minioi[:-1])) | |
162 | else: | |
163 | o.set_minioi(int(options.minioi)) | |
164 | o.set_threshold(options.threshold) | |
165 | ||
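The minioi suffix handling above can be sketched as a standalone helper (`parse_minioi` is a hypothetical name; it normalizes every form to samples, whereas aubiocut calls the matching `set_minioi_*` setter directly):

```python
def parse_minioi(value, samplerate):
    """Convert a minimum inter-onset interval given as samples ("441"),
    seconds ("0.02s") or milliseconds ("20ms") into samples."""
    if value.endswith('ms'):
        # milliseconds: convert to seconds, then to samples
        return int(float(value[:-2]) * samplerate / 1000.)
    if value.endswith('s'):
        # seconds: convert to samples
        return int(float(value[:-1]) * samplerate)
    # no suffix: already in samples
    return int(value)
```

At 44100 Hz, "20ms", "0.02s", and "882" all denote the same interval.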
166 | timestamps = [] | |
167 | total_frames = 0 | |
168 | # analyze pass | |
169 | while True: | |
170 | samples, read = s() | |
171 | if o(samples): | |
172 | timestamps.append (o.get_last()) | |
173 | if options.verbose: print ("%.4f" % o.get_last_s()) | |
174 | total_frames += read | |
175 | if read < hopsize: break | |
176 | del s | |
177 | # print some info | |
178 | nstamps = len(timestamps) | |
179 | duration = float (total_frames) / float(samplerate) | |
180 | info = 'found %(nstamps)d timestamps in %(source_file)s' % locals() | |
181 | info += ' (total %(duration).2fs at %(samplerate)dHz)\n' % locals() | |
182 | sys.stderr.write(info) | |
183 | ||
184 | # cutting pass | |
185 | if options.cut and nstamps > 0: | |
186 | # generate output files | |
187 | from aubio.slicing import slice_source_at_stamps | |
188 | timestamps_end = None | |
189 | if options.cut_until_nslices and options.cut_until_nsamples: | |
190 | print ("warning: both cut_until_nsamples and cut_until_nslices set, using cut_until_nslices") | |
191 | if options.cut_until_nsamples: | |
192 | timestamps_end = [t + options.cut_until_nsamples for t in timestamps[1:]] | |
193 | timestamps_end += [ 1e120 ] | |
194 | if options.cut_until_nslices: | |
195 | timestamps_end = [t for t in timestamps[1 + options.cut_until_nslices:]] | |
196 | timestamps_end += [ 1e120 ] * (options.cut_until_nslices + 1) | |
197 | slice_source_at_stamps(source_file, timestamps, timestamps_end = timestamps_end, | |
198 | output_dir = options.output_directory, | |
199 | samplerate = samplerate) | |
200 | ||
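The two `timestamps_end` constructions in the cutting pass can be illustrated with plain lists (the onset positions below are made up for illustration; `1e120` stands in for "end of file"):

```python
# illustrative onset positions, in samples (not real aubio output)
timestamps = [100, 500, 900]

# --cut-until-nsamples: each slice ends some samples after the *next* onset;
# the last slice runs to the end of the file
cut_until_nsamples = 10
ends_nsamples = [t + cut_until_nsamples for t in timestamps[1:]] + [1e120]

# --cut-until-nslices: each slice ends at the onset N slices further on;
# the last N+1 slices run to the end of the file
cut_until_nslices = 1
ends_nslices = timestamps[1 + cut_until_nslices:] + [1e120] * (cut_until_nslices + 1)
```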
201 | # print some info | |
202 | duration = float (total_frames) / float(samplerate) | |
203 | info = 'created %(nstamps)d slices from %(source_file)s' % locals() | |
204 | info += ' (total %(duration).2fs at %(samplerate)dHz)\n' % locals() | |
205 | sys.stderr.write(info) |
18 | 18 | self.o = onset(samplerate = self.samplerate) |
19 | 19 | |
20 | 20 | def test_get_delay(self): |
21 | assert_equal (self.o.get_delay(), int(4.3 * self.o.hop_size)) | |
21 | self.assertGreater(self.o.get_delay(), 0) | |
22 | 22 | |
23 | 23 | def test_get_delay_s(self): |
24 | assert_almost_equal (self.o.get_delay_s(), self.o.get_delay() / float(self.samplerate)) | |
24 | self.assertGreater(self.o.get_delay_s(), 0.) | |
25 | 25 | |
26 | 26 | def test_get_delay_ms(self): |
27 | assert_almost_equal (self.o.get_delay_ms(), self.o.get_delay() * 1000. / self.samplerate, 5) | |
27 | self.assertGreater(self.o.get_delay_ms(), 0.) | |
28 | 28 | |
29 | 29 | def test_get_minioi(self): |
30 | assert_almost_equal (self.o.get_minioi(), 0.02 * self.samplerate) | |
30 | self.assertGreater(self.o.get_minioi(), 0) | |
31 | 31 | |
32 | 32 | def test_get_minioi_s(self): |
33 | assert_almost_equal (self.o.get_minioi_s(), 0.02) | |
33 | self.assertGreater(self.o.get_minioi_s(), 0.) | |
34 | 34 | |
35 | 35 | def test_get_minioi_ms(self): |
36 | assert_equal (self.o.get_minioi_ms(), 20.) | |
36 | self.assertGreater(self.o.get_minioi_ms(), 0.) | |
37 | 37 | |
38 | 38 | def test_get_threshold(self): |
39 | assert_almost_equal (self.o.get_threshold(), 0.3) | |
39 | self.assertGreater(self.o.get_threshold(), 0.) | |
40 | 40 | |
41 | 41 | def test_set_delay(self): |
42 | 42 | val = 256 |
115 | 115 | g.close() |
116 | 116 | del_tmp_sink_path(sink_path) |
117 | 117 | |
118 | def test_read_with(self): | |
119 | samplerate = 44100 | |
120 | sink_path = get_tmp_sink_path() | |
121 | vec = fvec(128) | |
122 | with sink(sink_path, samplerate) as g: | |
123 | for _ in range(10): | |
124 | g(vec, 128) | |
125 | ||
118 | 126 | if __name__ == '__main__': |
119 | 127 | main() |
4 | 4 | from numpy.testing import TestCase, assert_equal |
5 | 5 | from aubio import source |
6 | 6 | from .utils import list_all_sounds |
7 | import numpy as np | |
8 | 7 | |
9 | 8 | import warnings |
10 | 9 | warnings.filterwarnings('ignore', category=UserWarning, append=True) |
168 | 167 | #print (result_str.format(*result_params)) |
169 | 168 | return total_frames |
170 | 169 | |
170 | class aubio_source_with(aubio_source_test_case_base): | |
171 | ||
173 | @params(*list_of_sounds) | |
174 | def test_read_from_mono(self, filename): | |
175 | total_frames = 0 | |
176 | hop_size = 2048 | |
177 | with source(filename, 0, hop_size) as input_source: | |
178 | assert_equal(input_source.hop_size, hop_size) | |
179 | #assert_equal(input_source.samplerate, samplerate) | |
180 | total_frames = 0 | |
181 | for frames in input_source: | |
182 | total_frames += frames.shape[-1] | |
183 | # check we read as many samples as we expected | |
184 | assert_equal(total_frames, input_source.duration) | |
185 | ||
171 | 186 | if __name__ == '__main__': |
172 | 187 | main() |
0 | 0 | #! /bin/sh |
1 | ||
2 | # cd to aubio directory for consistency | |
3 | cd `dirname $0`/.. | |
1 | 4 | |
2 | 5 | AUBIO_TMPDIR=`mktemp -d /var/tmp/aubio-build-XXXX` |
3 | 6 | PACKAGE=aubio |
0 | 0 | #! /bin/bash |
1 | 1 | |
2 | # This script cross compiles aubio for windows using mingw, both for 32 and 64 | |
3 | # bits. Built binaries will be placed in ./dist-win32 and ./dist-win64. | |
4 | ||
2 | # This script cross compiles aubio for windows using mingw, four times: | |
3 | # | |
4 | # - 32 and 64 bits with no external dependencies | |
5 | # - 32 and 64 bits against ffmpeg | |
6 | # | |
5 | 7 | # On debian or ubuntu, you will want to 'apt-get install gcc-mingw-w64' |
6 | 8 | |
7 | 9 | set -e |
8 | 10 | set -x |
9 | 11 | |
10 | WAFOPTS="-v --disable-avcodec --disable-samplerate --disable-jack --disable-sndfile" | |
12 | python this_version.py -v | |
13 | VERSION=`python $PWD/this_version.py -v` | |
11 | 14 | |
12 | [ -d dist-win32 ] && rm -rf dist-win32 | |
13 | [ -d dist-win64 ] && rm -rf dist-win64 | |
15 | FFMPEG_BUILDS_URL="https://ffmpeg.zeranoe.com/builds" | |
16 | FFMPEG_DEFAULT="20170404-1229007" | |
14 | 17 | |
15 | CFLAGS="-Os" \ | |
16 | LDFLAGS="" \ | |
17 | CC=x86_64-w64-mingw32-gcc \ | |
18 | ./waf distclean configure build install --destdir=$PWD/dist-win64 \ | |
19 | --testcmd="echo %s" \ | |
20 | $WAFOPTS --with-target-platform=win64 | |
18 | # define some default CFLAGS | |
19 | DEF_CFLAGS="-Os -I/usr/share/mingw-w64" | |
20 | DEF_LDFLAGS="" | |
21 | 21 | |
22 | CFLAGS="-Os" \ | |
23 | LDFLAGS="" \ | |
24 | CC=i686-w64-mingw32-gcc \ | |
25 | ./waf distclean configure build install --destdir=$PWD/dist-win32 \ | |
26 | --testcmd="echo %s" \ | |
27 | $WAFOPTS --with-target-platform=win32 | |
22 | WAFOPTS="" | |
23 | # disable external deps to make sure we don't try to use the host packages | |
24 | WAFOPTS+=" --disable-samplerate --disable-jack --disable-sndfile" | |
25 | # disable waf's avcodec detection; ffmpeg is linked manually in get_cflags below | |
26 | WAFOPTS+=" --disable-avcodec" | |
27 | # install without a prefix | |
28 | WAFOPTS+=" --prefix=/" | |
29 | # compile the tests, but fake running them | |
31 | # passing this option via WAFOPTS fails (escaping?); added in the actual waf call below | |
31 | #WAFOPTS+=" --testcmd='echo %s'" | |
32 | ||
33 | # debugging | |
34 | #WAFOPTS+=" -v" | |
35 | #WAFOPTS+=" -j1" | |
36 | #WAFOPTS+=" --notests" | |
37 | ||
38 | function fetch_ffpmeg() { | |
39 | ## manually add ffmpeg (no pkg-config .pc files in bins) | |
40 | [ -n "$FFMPEG_VERSION" ] || FFMPEG_VERSION=$FFMPEG_DEFAULT | |
41 | FFMPEG_TARBALL="$PWD/ffmpeg-$FFMPEG_VERSION-$TARGET-dev.zip" | |
42 | FFMPEG_BINARIES="${FFMPEG_TARBALL%%.zip}" | |
43 | if [ ! -d "$FFMPEG_BINARIES" ] | |
44 | then | |
45 | if [ ! -f "$FFMPEG_TARBALL" ] | |
46 | then | |
47 | curl -O $FFMPEG_BUILDS_URL/$TARGET/dev/ffmpeg-$FFMPEG_VERSION-$TARGET-dev.zip | |
48 | else | |
49 | echo "using $FFMPEG_TARBALL" | |
50 | fi | |
51 | unzip -x $FFMPEG_TARBALL | |
52 | else | |
53 | echo "using $FFMPEG_BINARIES" | |
54 | fi | |
55 | } | |
56 | ||
57 | function get_cflags() { | |
58 | CFLAGS="$DEF_CFLAGS" | |
59 | LDFLAGS="$DEF_LDFLAGS" | |
60 | if [ -n "$WITH_FFMEG" ] | |
61 | then | |
62 | fetch_ffpmeg | |
63 | CFLAGS+=" -DHAVE_LIBAV=1 -DHAVE_SWRESAMPLE=1" | |
64 | CFLAGS+=" -I$FFMPEG_BINARIES/include" | |
65 | LDFLAGS+=" -lavcodec -lavformat -lavutil -lswresample" | |
66 | LDFLAGS+=" -L$FFMPEG_BINARIES/lib" | |
67 | fi | |
68 | } | |
69 | ||
70 | function build_mingw() { | |
71 | DESTDIR="$PWD/aubio-$VERSION-$TARGET" | |
72 | [ -n "$WITH_FFMEG" ] && DESTDIR+="-ffmpeg" | |
73 | [ -f $DESTDIR.zip ] && echo "Remove existing $DESTDIR.zip first" && exit 1 | |
74 | [ -d $DESTDIR ] && rm -rf $DESTDIR | |
75 | WAFOPTS_TGT="$WAFOPTS --destdir=$DESTDIR" | |
76 | WAFOPTS_TGT+=" --with-target-platform=$TARGET" | |
77 | get_cflags | |
78 | CFLAGS="$CFLAGS" LDFLAGS="$LDFLAGS" \ | |
79 | ./waf distclean configure build install $WAFOPTS_TGT --testcmd='echo %s' | |
80 | # fix dll location (see https://github.com/waf-project/waf/issues/1860) | |
81 | mv $DESTDIR/lib/libaubio-5.dll $DESTDIR/bin | |
82 | # generate def file (see https://github.com/aubio/aubio/issues/97) | |
83 | ( echo -e "EXPORTS"; $NM $DESTDIR/bin/libaubio-5.dll | grep T\ | \ | |
84 | egrep "(aubio|fvec|cvec|lvec|fmat)" | sed 's/^.* T _\?//' ) \ | |
85 | > $DESTDIR/bin/libaubio-5.def | |
86 | zip -r $DESTDIR.zip `basename $DESTDIR` | |
87 | rm -rf $DESTDIR | |
88 | sha256sum $DESTDIR.zip > $DESTDIR.zip.sha256 | |
89 | } | |
90 | ||
91 | function build_mingw32() { | |
92 | TARGET=win32 | |
93 | export CC=i686-w64-mingw32-gcc | |
94 | export NM=i686-w64-mingw32-nm | |
95 | build_mingw | |
96 | } | |
97 | ||
98 | function build_mingw64() { | |
99 | TARGET=win64 | |
100 | export CC=x86_64-w64-mingw32-gcc | |
101 | export NM=x86_64-w64-mingw32-nm | |
102 | build_mingw | |
103 | } | |
104 | ||
105 | # fetch waf if needed | |
106 | [ -f "waf" ] || make getwaf | |
107 | ||
108 | # first build without ffmpeg | |
109 | build_mingw32 | |
110 | build_mingw64 | |
111 | ||
112 | # then build against ffmpeg | |
113 | WITH_FFMEG=1 | |
114 | build_mingw32 | |
115 | build_mingw64 | |
116 | ||
117 | set +x | |
118 | echo "" | |
119 | echo "All done! The following files were generated:" | |
120 | echo "" | |
121 | ls -lart aubio*.zip* |
2 | 2 | set -e |
3 | 3 | set -x |
4 | 4 | |
5 | WAFURL=https://waf.io/waf-1.8.22 | |
5 | WAFURL=https://waf.io/waf-1.9.6 | |
6 | 6 | |
7 | 7 | ( which wget > /dev/null && wget -qO waf $WAFURL ) || ( which curl > /dev/null && curl $WAFURL > waf ) |
8 | 8 |
1 | 1 | |
2 | 2 | import sys, os.path, glob |
3 | 3 | from setuptools import setup, Extension |
4 | from python.lib.moresetuptools import * | |
4 | from python.lib.moresetuptools import build_ext, CleanGenerated | |
5 | 5 | # function to generate gen/*.{c,h} |
6 | from python.lib.gen_external import generate_external, header, output_path | |
6 | from this_version import get_aubio_version, get_aubio_pyversion | |
7 | 7 | |
8 | # read from VERSION | |
9 | for l in open('VERSION').readlines(): exec (l.strip()) | |
10 | ||
11 | if AUBIO_MAJOR_VERSION is None or AUBIO_MINOR_VERSION is None \ | |
12 | or AUBIO_PATCH_VERSION is None: | |
13 | raise SystemError("Failed parsing VERSION file.") | |
14 | ||
15 | __version__ = '.'.join(map(str, [AUBIO_MAJOR_VERSION, | |
16 | AUBIO_MINOR_VERSION, | |
17 | AUBIO_PATCH_VERSION])) | |
18 | if AUBIO_VERSION_STATUS is not None: | |
19 | if AUBIO_VERSION_STATUS.startswith('~'): | |
20 | AUBIO_VERSION_STATUS = AUBIO_VERSION_STATUS[1:] | |
21 | #__version__ += AUBIO_VERSION_STATUS | |
8 | __version__ = get_aubio_pyversion() | |
9 | __aubio_version__ = get_aubio_version() | |
22 | 10 | |
23 | 11 | include_dirs = [] |
24 | 12 | library_dirs = [] |
25 | define_macros = [('AUBIO_VERSION', '%s' % __version__)] | |
13 | define_macros = [('AUBIO_VERSION', '%s' % __aubio_version__)] | |
26 | 14 | extra_link_args = [] |
27 | 15 | |
28 | 16 | include_dirs += [ 'python/ext' ] |
29 | include_dirs += [ output_path ] # aubio-generated.h | |
30 | 17 | try: |
31 | 18 | import numpy |
32 | 19 | include_dirs += [ numpy.get_include() ] |
69 | 56 | version = __version__, |
70 | 57 | packages = ['aubio'], |
71 | 58 | package_dir = {'aubio':'python/lib/aubio'}, |
72 | scripts = ['python/scripts/aubiocut'], | |
73 | 59 | ext_modules = [aubio_extension], |
74 | 60 | description = 'a collection of tools for music analysis', |
75 | 61 | long_description = 'a collection of tools for music analysis', |
78 | 64 | author_email = 'piem@aubio.org', |
79 | 65 | maintainer = 'Paul Brossier', |
80 | 66 | maintainer_email = 'piem@aubio.org', |
81 | url = 'http://aubio.org/', | |
67 | url = 'https://aubio.org/', | |
82 | 68 | platforms = 'any', |
83 | 69 | classifiers = classifiers, |
84 | 70 | install_requires = ['numpy'], |
71 | setup_requires = ['numpy'], | |
85 | 72 | cmdclass = { |
86 | 73 | 'clean': CleanGenerated, |
87 | 74 | 'build_ext': build_ext, |
88 | 75 | }, |
76 | entry_points = { | |
77 | 'console_scripts': [ | |
78 | 'aubio = aubio.cmd:main', | |
79 | 'aubiocut = aubio.cut:main', | |
80 | ], | |
81 | }, | |
89 | 82 | test_suite = 'nose2.collector.collector', |
83 | extras_require = { | |
84 | 'tests': ['numpy'], | |
85 | }, | |
90 | 86 | ) |
186 | 186 | #include "spectral/filterbank_mel.h" |
187 | 187 | #include "spectral/mfcc.h" |
188 | 188 | #include "spectral/specdesc.h" |
189 | #include "spectral/awhitening.h" | |
189 | 190 | #include "spectral/tss.h" |
190 | 191 | #include "pitch/pitch.h" |
191 | 192 | #include "onset/onset.h" |
138 | 138 | cvec_norm_zeros(s); |
139 | 139 | cvec_phas_zeros(s); |
140 | 140 | } |
141 | ||
142 | void cvec_logmag(cvec_t *s, smpl_t lambda) { | |
143 | uint_t j; | |
144 | for (j=0; j< s->length; j++) { | |
145 | s->norm[j] = LOG(lambda * s->norm[j] + 1); | |
146 | } | |
147 | } |
229 | 229 | */ |
230 | 230 | void cvec_zeros(cvec_t *s); |
231 | 231 | |
232 | /** compute the logarithmic magnitude of a cvec | |
233 | | |
234 | \param s input cvec to compress | |
235 | \param lambda compression factor applied before taking the logarithm | |
236 | ||
237 | \f$ S_k = log( \lambda * S_k + 1 ) \f$ | |
238 | ||
239 | */ | |
240 | void cvec_logmag(cvec_t *s, smpl_t lambda); | |
241 | ||
232 | 242 | #ifdef __cplusplus |
233 | 243 | } |
234 | 244 | #endif |
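The `cvec_logmag` formula `S_k = log(lambda * S_k + 1)` can be checked numerically with numpy (a sketch mirroring the C loop, not bound to the actual implementation):

```python
import numpy as np

def logmag(norm, lmbda):
    # S_k = log(lambda * S_k + 1), as in cvec_logmag
    return np.log(lmbda * norm + 1.)

# illustrative magnitude spectrum
spec = np.array([0., 1., 10.])
compressed = logmag(spec, 1.)
```

Note the `+ 1` keeps zero magnitudes at zero and the output non-negative for non-negative input.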
96 | 96 | return s; |
97 | 97 | } |
98 | 98 | #endif /* HAVE_WAVWRITE */ |
99 | //AUBIO_ERROR("sink: failed creating '%s' with samplerate %dHz\n", | |
100 | // uri, samplerate); | |
99 | #if !defined(HAVE_WAVWRITE) && \ | |
100 | !defined(HAVE_SNDFILE) && \ | |
101 | !defined(HAVE_SINK_APPLE_AUDIO) | |
102 | AUBIO_ERROR("sink: failed creating '%s' at %dHz (no sink built-in)\n", uri, samplerate); | |
103 | #endif | |
101 | 104 | AUBIO_FREE(s); |
102 | 105 | return NULL; |
103 | 106 | } |
112 | 112 | s->s_del = (del_aubio_source_t)(del_aubio_source_wavread); |
113 | 113 | return s; |
114 | 114 | } |
115 | #else /* failover message */ | |
116 | #if !(defined(HAVE_LIBAV) || defined(HAVE_SOURCE_APPLE_AUDIO) || defined(HAVE_SNDFILE)) | |
117 | AUBIO_ERROR("source: failed creating aubio source with %s" | |
118 | " at samplerate %d with hop_size %d\n", uri, samplerate, hop_size); | |
119 | #endif /* failover */ | |
120 | 115 | #endif /* HAVE_WAVREAD */ |
116 | #if !defined(HAVE_WAVREAD) && \ | |
117 | !defined(HAVE_LIBAV) && \ | |
118 | !defined(HAVE_SOURCE_APPLE_AUDIO) && \ | |
119 | !defined(HAVE_SNDFILE) | |
120 | AUBIO_ERROR("source: failed creating with %s at %dHz with hop size %d" | |
121 | " (no source built-in)\n", uri, samplerate, hop_size); | |
122 | #endif | |
121 | 123 | AUBIO_FREE(s); |
122 | 124 | return NULL; |
123 | 125 | } |
278 | 278 | uint_t aubio_source_apple_audio_close (aubio_source_apple_audio_t *s) |
279 | 279 | { |
280 | 280 | OSStatus err = noErr; |
281 | if (!s->audioFile) { return AUBIO_FAIL; } | |
281 | if (!s->audioFile) { return AUBIO_OK; } | |
282 | 282 | err = ExtAudioFileDispose(s->audioFile); |
283 | 283 | s->audioFile = NULL; |
284 | 284 | if (err) { |
23 | 23 | |
24 | 24 | #include <libavcodec/avcodec.h> |
25 | 25 | #include <libavformat/avformat.h> |
26 | #if defined(HAVE_SWRESAMPLE) | |
27 | #include <libswresample/swresample.h> | |
28 | #elif defined(HAVE_AVRESAMPLE) | |
26 | 29 | #include <libavresample/avresample.h> |
30 | #endif | |
27 | 31 | #include <libavutil/opt.h> |
28 | 32 | #include <stdlib.h> |
29 | 33 | |
37 | 41 | ) |
38 | 42 | |
39 | 43 | // backward compatibility with libavcodec55 |
44 | #if LIBAVCODEC_VERSION_INT < AV_VERSION_INT(57,0,0) | |
45 | #define HAVE_AUBIO_LIBAVCODEC_DEPRECATED 1 | |
46 | #endif | |
47 | ||
40 | 48 | #if LIBAVCODEC_VERSION_INT < AV_VERSION_INT(55,28,1) |
41 | #warning "libavcodec55 is deprecated" | |
42 | #define HAVE_AUBIO_LIBAVCODEC_DEPRECATED 1 | |
49 | #warning "libavcodec < 56 is deprecated" | |
43 | 50 | #define av_frame_alloc avcodec_alloc_frame |
44 | 51 | #define av_frame_free avcodec_free_frame |
45 | 52 | #define av_packet_unref av_free_packet |
67 | 74 | AVCodecContext *avCodecCtx; |
68 | 75 | AVFrame *avFrame; |
69 | 76 | AVPacket avPacket; |
77 | #ifdef HAVE_AVRESAMPLE | |
70 | 78 | AVAudioResampleContext *avr; |
79 | #elif defined(HAVE_SWRESAMPLE) | |
80 | SwrContext *avr; | |
81 | #endif | |
71 | 82 | smpl_t *output; |
72 | 83 | uint_t read_samples; |
73 | 84 | uint_t read_index; |
275 | 286 | int64_t input_layout = av_get_default_channel_layout(s->input_channels); |
276 | 287 | uint_t output_channels = multi ? s->input_channels : 1; |
277 | 288 | int64_t output_layout = av_get_default_channel_layout(output_channels); |
289 | #ifdef HAVE_AVRESAMPLE | |
278 | 290 | AVAudioResampleContext *avr = avresample_alloc_context(); |
279 | 291 | AVAudioResampleContext *oldavr = s->avr; |
292 | #elif defined(HAVE_SWRESAMPLE) | |
293 | SwrContext *avr = swr_alloc(); | |
294 | SwrContext *oldavr = s->avr; | |
295 | #endif /* HAVE_AVRESAMPLE || HAVE_SWRESAMPLE */ | |
280 | 296 | |
281 | 297 | av_opt_set_int(avr, "in_channel_layout", input_layout, 0); |
282 | 298 | av_opt_set_int(avr, "out_channel_layout", output_layout, 0); |
291 | 307 | // TODO: use planar? |
292 | 308 | //av_opt_set_int(avr, "out_sample_fmt", AV_SAMPLE_FMT_FLTP, 0); |
293 | 309 | int err; |
310 | #ifdef HAVE_AVRESAMPLE | |
294 | 311 | if ( ( err = avresample_open(avr) ) < 0) { |
312 | #elif defined(HAVE_SWRESAMPLE) | |
313 | if ( ( err = swr_init(avr) ) < 0) { | |
314 | #endif /* HAVE_AVRESAMPLE || HAVE_SWRESAMPLE */ | |
295 | 315 | char errorstr[256]; |
296 | 316 | av_strerror (err, errorstr, sizeof(errorstr)); |
297 | 317 | AUBIO_ERR("source_avcodec: Could not open AVAudioResampleContext for %s (%s)\n", |
301 | 321 | } |
302 | 322 | s->avr = avr; |
303 | 323 | if (oldavr != NULL) { |
324 | #ifdef HAVE_AVRESAMPLE | |
304 | 325 | avresample_close( oldavr ); |
326 | #elif defined(HAVE_SWRESAMPLE) | |
327 | swr_close ( oldavr ); | |
328 | #endif /* HAVE_AVRESAMPLE || HAVE_SWRESAMPLE */ | |
305 | 329 | av_free ( oldavr ); |
306 | 330 | oldavr = NULL; |
307 | 331 | } |
315 | 339 | AVFrame *avFrame = s->avFrame; |
316 | 340 | AVPacket avPacket = s->avPacket; |
317 | 341 | av_init_packet (&avPacket); |
342 | #ifdef HAVE_AVRESAMPLE | |
318 | 343 | AVAudioResampleContext *avr = s->avr; |
344 | #elif defined(HAVE_SWRESAMPLE) | |
345 | SwrContext *avr = s->avr; | |
346 | #endif /* HAVE_AVRESAMPLE || HAVE_SWRESAMPLE */ | |
319 | 347 | smpl_t *output = s->output; |
320 | 348 | *read_samples = 0; |
321 | 349 | |
330 | 358 | char errorstr[256]; |
331 | 359 | av_strerror (err, errorstr, sizeof(errorstr)); |
332 | 360 | AUBIO_ERR("source_avcodec: could not read frame in %s (%s)\n", s->path, errorstr); |
361 | s->eof = 1; | |
333 | 362 | goto beach; |
334 | 363 | } |
335 | 364 | } while (avPacket.stream_index != s->selected_stream); |
347 | 376 | } |
348 | 377 | if (ret < 0) { |
349 | 378 | if (ret == AVERROR(EAGAIN)) { |
350 | AUBIO_WRN("source_avcodec: output is not available right now - user must try to send new input\n"); | |
379 | //AUBIO_WRN("source_avcodec: output is not available right now - user must try to send new input\n"); | |
380 | goto beach; | |
351 | 381 | } else if (ret == AVERROR_EOF) { |
352 | 382 | AUBIO_WRN("source_avcodec: the decoder has been fully flushed, and there will be no more output frames\n"); |
353 | 383 | } else { |
368 | 398 | goto beach; |
369 | 399 | } |
370 | 400 | |
401 | #ifdef HAVE_AVRESAMPLE | |
371 | 402 | int in_linesize = 0; |
372 | 403 | av_samples_get_buffer_size(&in_linesize, avCodecCtx->channels, |
373 | 404 | avFrame->nb_samples, avCodecCtx->sample_fmt, 1); |
377 | 408 | int out_samples = avresample_convert ( avr, |
378 | 409 | (uint8_t **)&output, out_linesize, max_out_samples, |
379 | 410 | (uint8_t **)avFrame->data, in_linesize, in_samples); |
411 | #elif defined(HAVE_SWRESAMPLE) | |
412 | int in_samples = avFrame->nb_samples; | |
413 | int max_out_samples = AUBIO_AVCODEC_MAX_BUFFER_SIZE / avCodecCtx->channels; | |
414 | int out_samples = swr_convert( avr, | |
415 | (uint8_t **)&output, max_out_samples, | |
416 | (const uint8_t **)avFrame->data, in_samples); | |
417 | #endif /* HAVE_AVRESAMPLE || HAVE_SWRESAMPLE */ | |
380 | 418 | if (out_samples <= 0) { |
381 | 419 | AUBIO_WRN("source_avcodec: no sample found while converting frame (%s)\n", s->path); |
382 | 420 | goto beach; |
388 | 426 | s->avFormatCtx = avFormatCtx; |
389 | 427 | s->avCodecCtx = avCodecCtx; |
390 | 428 | s->avFrame = avFrame; |
429 | #if defined(HAVE_AVRESAMPLE) || defined(HAVE_SWRESAMPLE) | |
391 | 430 | s->avr = avr; |
431 | #endif /* HAVE_AVRESAMPLE || HAVE_SWRESAMPLE */ | |
392 | 432 | s->output = output; |
393 | 433 | |
394 | 434 | av_packet_unref(&avPacket); |
476 | 516 | int64_t min_ts = MAX(resampled_pos - 2000, 0); |
477 | 517 | int64_t max_ts = MIN(resampled_pos + 2000, INT64_MAX); |
478 | 518 | int seek_flags = AVSEEK_FLAG_FRAME | AVSEEK_FLAG_ANY; |
479 | int ret = avformat_seek_file(s->avFormatCtx, s->selected_stream, | |
519 | int ret = AUBIO_FAIL; | |
520 | if (s->avFormatCtx != NULL && s->avr != NULL) { | |
521 | ret = AUBIO_OK; | |
522 | } else { | |
523 | AUBIO_ERR("source_avcodec: failed seeking in %s (file not opened?)\n", s->path); | |
524 | return ret; | |
525 | } | |
526 | if ((sint_t)pos < 0) { | |
527 | AUBIO_ERR("source_avcodec: could not seek %s at %d (seeking position" | |
528 | " should be >= 0)\n", s->path, pos); | |
529 | return AUBIO_FAIL; | |
530 | } | |
531 | ret = avformat_seek_file(s->avFormatCtx, s->selected_stream, | |
480 | 532 | min_ts, resampled_pos, max_ts, seek_flags); |
481 | 533 | if (ret < 0) { |
482 | AUBIO_ERR("Failed seeking to %d in file %s", pos, s->path); | |
534 | AUBIO_ERR("source_avcodec: failed seeking to %d in file %s\n", pos, s->path); | |
483 | 535 | } |
484 | 536 | // reset read status |
485 | 537 | s->eof = 0; |
486 | 538 | s->read_index = 0; |
487 | 539 | s->read_samples = 0; |
540 | #ifdef HAVE_AVRESAMPLE | |
488 | 541 | // reset the AVAudioResampleContext |
489 | 542 | avresample_close(s->avr); |
490 | 543 | avresample_open(s->avr); |
544 | #elif defined(HAVE_SWRESAMPLE) | |
545 | swr_close(s->avr); | |
546 | swr_init(s->avr); | |
547 | #endif | |
491 | 548 | return ret; |
492 | 549 | } |
493 | 550 | |
501 | 558 | |
502 | 559 | uint_t aubio_source_avcodec_close(aubio_source_avcodec_t * s) { |
503 | 560 | if (s->avr != NULL) { |
561 | #ifdef HAVE_AVRESAMPLE | |
504 | 562 | avresample_close( s->avr ); |
563 | #elif defined(HAVE_SWRESAMPLE) | |
564 | swr_close ( s->avr ); | |
565 | #endif | |
505 | 566 | av_free ( s->avr ); |
506 | 567 | } |
507 | 568 | s->avr = NULL; |
508 | 569 | if (s->avCodecCtx != NULL) { |
570 | #ifndef HAVE_AUBIO_LIBAVCODEC_DEPRECATED | |
571 | avcodec_free_context( &s->avCodecCtx ); | |
572 | #else | |
509 | 573 | avcodec_close ( s->avCodecCtx ); |
574 | #endif | |
510 | 575 | } |
511 | 576 | s->avCodecCtx = NULL; |
512 | 577 | if (s->avFormatCtx != NULL) { |
513 | 578 | avformat_close_input(&s->avFormatCtx); |
514 | #ifndef HAVE_AUBIO_LIBAVCODEC_DEPRECATED // avoid crash on old libavcodec54 | |
515 | avformat_free_context(s->avFormatCtx); | |
516 | #endif | |
517 | 579 | s->avFormatCtx = NULL; |
518 | 580 | } |
519 | 581 | av_packet_unref(&s->avPacket); |
293 | 293 | |
294 | 294 | uint_t aubio_source_sndfile_seek (aubio_source_sndfile_t * s, uint_t pos) { |
295 | 295 | uint_t resampled_pos = (uint_t)ROUND(pos / s->ratio); |
296 | sf_count_t sf_ret = sf_seek (s->handle, resampled_pos, SEEK_SET); | |
296 | sf_count_t sf_ret; | |
297 | if (s->handle == NULL) { | |
298 | AUBIO_ERR("source_sndfile: failed seeking in %s (file not opened?)\n", | |
299 | s->path); | |
300 | return AUBIO_FAIL; | |
301 | } | |
302 | if ((sint_t)pos < 0) { | |
303 | AUBIO_ERR("source_sndfile: could not seek %s at %d (seeking position" | |
304 | " should be >= 0)\n", s->path, pos); | |
305 | return AUBIO_FAIL; | |
306 | } | |
307 | sf_ret = sf_seek (s->handle, resampled_pos, SEEK_SET); | |
297 | 308 | if (sf_ret == -1) { |
298 | 309 | AUBIO_ERR("source_sndfile: Failed seeking %s at %d: %s\n", s->path, pos, sf_strerror (NULL)); |
299 | 310 | return AUBIO_FAIL; |
308 | 319 | |
309 | 320 | uint_t aubio_source_sndfile_close (aubio_source_sndfile_t *s) { |
310 | 321 | if (!s->handle) { |
311 | return AUBIO_FAIL; | |
322 | return AUBIO_OK; | |
312 | 323 | } |
313 | 324 | if(sf_close(s->handle)) { |
314 | 325 | AUBIO_ERR("source_sndfile: Error closing file %s: %s\n", s->path, sf_strerror (NULL)); |
327 | 327 | uint_t i, j; |
328 | 328 | uint_t end = 0; |
329 | 329 | uint_t total_wrote = 0; |
330 | if (s->fid == NULL) { | |
331 | AUBIO_ERR("source_wavread: could not read from %s (file not opened)\n", | |
332 | s->path); | |
333 | return; | |
334 | } | |
330 | 335 | while (total_wrote < s->hop_size) { |
331 | 336 | end = MIN(s->read_samples - s->read_index, s->hop_size - total_wrote); |
332 | 337 | for (i = 0; i < end; i++) { |
361 | 366 | uint_t i,j; |
362 | 367 | uint_t end = 0; |
363 | 368 | uint_t total_wrote = 0; |
369 | if (s->fid == NULL) { | |
370 | AUBIO_ERR("source_wavread: could not read from %s (file not opened)\n", | |
371 | s->path); | |
372 | return; | |
373 | } | |
364 | 374 | while (total_wrote < s->hop_size) { |
365 | 375 | end = MIN(s->read_samples - s->read_index, s->hop_size - total_wrote); |
366 | 376 | for (j = 0; j < read_data->height; j++) { |
401 | 411 | |
402 | 412 | uint_t aubio_source_wavread_seek (aubio_source_wavread_t * s, uint_t pos) { |
403 | 413 | uint_t ret = 0; |
414 | if (s->fid == NULL) { | |
415 | AUBIO_ERR("source_wavread: could not seek %s (file not opened?)\n", s->path); | |
416 | return AUBIO_FAIL; | |
417 | } | |
404 | 418 | if ((sint_t)pos < 0) { |
419 | AUBIO_ERR("source_wavread: could not seek %s at %d (seeking position should be >= 0)\n", s->path, pos); | |
405 | 420 | return AUBIO_FAIL; |
406 | 421 | } |
407 | 422 | ret = fseek(s->fid, s->seek_start + pos * s->blockalign, SEEK_SET); |
423 | 438 | } |
424 | 439 | |
425 | 440 | uint_t aubio_source_wavread_close (aubio_source_wavread_t * s) { |
426 | if (!s->fid) { | |
427 | return AUBIO_FAIL; | |
441 | if (s->fid == NULL) { | |
442 | return AUBIO_OK; | |
428 | 443 | } |
429 | 444 | if (fclose(s->fid)) { |
430 | 445 | AUBIO_ERR("source_wavread: could not close %s (%s)\n", s->path, strerror(errno)); |
288 | 288 | } |
289 | 289 | } |
290 | 290 | |
291 | void fvec_push(fvec_t *in, smpl_t new_elem) { | |
292 | uint_t i; | |
293 | for (i = 0; i < in->length - 1; i++) { | |
294 | in->data[i] = in->data[i + 1]; | |
295 | } | |
296 | in->data[in->length - 1] = new_elem; | |
297 | } | |
298 | ||
299 | void fvec_clamp(fvec_t *in, smpl_t absmax) { | |
300 | uint_t i; | |
301 | for (i = 0; i < in->length; i++) { | |
302 | if (in->data[i] > 0 && in->data[i] > ABS(absmax)) { | |
303 | in->data[i] = ABS(absmax); | |
304 | } else if (in->data[i] < 0 && in->data[i] < -ABS(absmax)) { | |
305 | in->data[i] = -ABS(absmax); | |
306 | } | |
307 | } | |
308 | } | |
309 | ||
291 | 310 | smpl_t |
292 | 311 | aubio_level_lin (const fvec_t * f) |
293 | 312 | { |
115 | 115 | |
116 | 116 | */ |
117 | 117 | void fvec_ishift (fvec_t * v); |
118 | ||
119 | /** push a new element to the end of a vector, erasing the first element and | |
120 | * sliding all others | |
121 | ||
122 | \param in vector to push to | |
123 | \param new_elem new_element to add at the end of the vector | |
124 | ||
125 | In numpy words, this is equivalent to: in = np.concatenate([in, [new_elem]])[1:] | |
126 | ||
127 | */ | |
128 | void fvec_push(fvec_t *in, smpl_t new_elem); | |
118 | 129 | |
119 | 130 | /** compute the sum of all elements of a vector |
120 | 131 |
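The numpy equivalence quoted in the `fvec_push` comment can be verified directly (a sketch; the C function mutates the vector in place, while this returns a new array):

```python
import numpy as np

def fvec_push(vec, new_elem):
    # drop the first element and append the new one at the end,
    # matching: in = np.concatenate([in, [new_elem]])[1:]
    return np.concatenate([vec, [new_elem]])[1:]

v = np.array([1., 2., 3.])
v = fvec_push(v, 4.)
```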
155 | 155 | */ |
156 | 156 | smpl_t aubio_level_detection (const fvec_t * v, smpl_t threshold); |
157 | 157 | |
158 | /** clamp the values of a vector within the range [-abs(absmax), abs(absmax)] | |
159 | | |
160 | \param in vector to clamp | |
161 | \param absmax absolute maximum above which vector elements are clamped | |
162 | ||
163 | */ | |
164 | void fvec_clamp(fvec_t *in, smpl_t absmax); | |
165 | ||
158 | 166 | #ifdef __cplusplus |
159 | 167 | } |
160 | 168 | #endif |
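Assuming a non-negative `absmax`, the clamping described above behaves like `numpy.clip` (a sketch; `fvec_clamp` operates in place on the C side):

```python
import numpy as np

def fvec_clamp(vec, absmax):
    # limit values to [-abs(absmax), abs(absmax)]
    m = abs(absmax)
    return np.clip(vec, -m, m)

v = fvec_clamp(np.array([-2., 0.5, 3.]), 1.)
```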
22 | 22 | #include "cvec.h" |
23 | 23 | #include "spectral/specdesc.h" |
24 | 24 | #include "spectral/phasevoc.h" |
25 | #include "spectral/awhitening.h" | |
25 | 26 | #include "onset/peakpicker.h" |
26 | 27 | #include "mathutils.h" |
27 | 28 | #include "onset/onset.h" |
29 | ||
30 | void aubio_onset_default_parameters (aubio_onset_t *o, const char_t * method); | |
28 | 31 | |
29 | 32 | /** structure to store object state */ |
30 | 33 | struct _aubio_onset_t { |
41 | 44 | |
42 | 45 | uint_t total_frames; /**< total number of frames processed since the beginning */ |
43 | 46 | uint_t last_onset; /**< last detected onset location, in frames */ |
47 | ||
48 | uint_t apply_compression; | |
49 | smpl_t lambda_compression; | |
50 | uint_t apply_awhitening; /**< apply adaptive spectral whitening */ | |
51 | aubio_spectral_whitening_t *spectral_whitening; | |
44 | 52 | }; |
45 | 53 | |
46 | 54 | /* execute onset detection function on input buffer */ |
48 | 56 | { |
49 | 57 | smpl_t isonset = 0; |
50 | 58 | aubio_pvoc_do (o->pv,input, o->fftgrain); |
59 | /* | |
60 | if (apply_filtering) { | |
61 | } | |
62 | */ | |
63 | if (o->apply_awhitening) { | |
64 | aubio_spectral_whitening_do(o->spectral_whitening, o->fftgrain); | |
65 | } | |
66 | if (o->apply_compression) { | |
67 | cvec_logmag(o->fftgrain, o->lambda_compression); | |
68 | } | |
51 | 69 | aubio_specdesc_do (o->od, o->fftgrain, o->desc); |
52 | 70 | aubio_peakpicker_do(o->pp, o->desc, onset); |
53 | 71 | isonset = onset->data[0]; |
56 | 74 | //AUBIO_DBG ("silent onset, not marking as onset\n"); |
57 | 75 | isonset = 0; |
58 | 76 | } else { |
77 | // we have an onset | |
59 | 78 | uint_t new_onset = o->total_frames + (uint_t)ROUND(isonset * o->hop_size); |
79 | // check if last onset time was more than minioi ago | |
60 | 80 | if (o->last_onset + o->minioi < new_onset) { |
61 | //AUBIO_DBG ("accepted detection, marking as onset\n"); | |
62 | o->last_onset = new_onset; | |
81 | // start of file: make sure (new_onset - delay) >= 0 | |
82 | if (o->last_onset > 0 && o->delay > new_onset) { | |
83 | isonset = 0; | |
84 | } else { | |
85 | //AUBIO_DBG ("accepted detection, marking as onset\n"); | |
86 | o->last_onset = MAX(o->delay, new_onset); | |
87 | } | |
63 | 88 | } else { |
64 | 89 | //AUBIO_DBG ("doubled onset, not marking as onset\n"); |
65 | 90 | isonset = 0; |
98 | 123 | return aubio_onset_get_last_s (o) * 1000.; |
99 | 124 | } |
100 | 125 | |
126 | uint_t aubio_onset_set_awhitening (aubio_onset_t *o, uint_t enable) | |
127 | { | |
128 | o->apply_awhitening = enable == 1 ? 1 : 0; | |
129 | return AUBIO_OK; | |
130 | } | |
131 | ||
132 | smpl_t aubio_onset_get_awhitening (aubio_onset_t *o) | |
133 | { | |
134 | return o->apply_awhitening; | |
135 | } | |
136 | ||
137 | uint_t aubio_onset_set_compression (aubio_onset_t *o, smpl_t lambda) | |
138 | { | |
139 | if (lambda < 0.) { | |
140 | return AUBIO_FAIL; | |
141 | } | |
142 | o->lambda_compression = lambda; | |
143 | o->apply_compression = (o->lambda_compression > 0.) ? 1 : 0; | |
144 | return AUBIO_OK; | |
145 | } | |
146 | ||
147 | smpl_t aubio_onset_get_compression (aubio_onset_t *o) | |
148 | { | |
149 | return o->apply_compression ? o->lambda_compression : 0; | |
150 | } | |
151 | ||
101 | 152 | uint_t aubio_onset_set_silence(aubio_onset_t * o, smpl_t silence) { |
102 | 153 | o->silence = silence; |
103 | 154 | return AUBIO_OK; |
207 | 258 | if (o->od == NULL) goto beach_specdesc; |
208 | 259 | o->fftgrain = new_cvec(buf_size); |
209 | 260 | o->desc = new_fvec(1); |
210 | ||
211 | /* set some default parameter */ | |
212 | aubio_onset_set_threshold (o, 0.3); | |
213 | aubio_onset_set_delay(o, 4.3 * hop_size); | |
214 | aubio_onset_set_minioi_ms(o, 20.); | |
215 | aubio_onset_set_silence(o, -70.); | |
261 | o->spectral_whitening = new_aubio_spectral_whitening(buf_size, hop_size, samplerate); | |
216 | 262 | |
217 | 263 | /* initialize internal variables */ |
218 | o->last_onset = 0; | |
219 | o->total_frames = 0; | |
264 | aubio_onset_set_default_parameters (o, onset_mode); | |
265 | ||
266 | aubio_onset_reset(o); | |
220 | 267 | return o; |
221 | 268 | |
222 | 269 | beach_specdesc: |
227 | 274 | return NULL; |
228 | 275 | } |
229 | 276 | |
277 | void aubio_onset_reset (aubio_onset_t *o) { | |
278 | o->last_onset = 0; | |
279 | o->total_frames = 0; | |
280 | } | |
281 | ||
282 | uint_t aubio_onset_set_default_parameters (aubio_onset_t * o, const char_t * onset_mode) | |
283 | { | |
284 | uint_t ret = AUBIO_OK; | |
285 | /* set some default parameters */ | |
286 | aubio_onset_set_threshold (o, 0.3); | |
287 | aubio_onset_set_delay (o, 4.3 * o->hop_size); | |
288 | aubio_onset_set_minioi_ms (o, 50.); | |
289 | aubio_onset_set_silence (o, -70.); | |
290 | // disable spectral whitening | |
291 | aubio_onset_set_awhitening (o, 0); | |
292 | // disable logarithmic magnitude | |
293 | aubio_onset_set_compression (o, 0.); | |
294 | ||
295 | /* method specific optimisations */ | |
296 | if (strcmp (onset_mode, "energy") == 0) { | |
297 | } else if (strcmp (onset_mode, "hfc") == 0 || strcmp (onset_mode, "default") == 0) { | |
298 | aubio_onset_set_threshold (o, 0.058); | |
299 | aubio_onset_set_compression (o, 1.); | |
300 | } else if (strcmp (onset_mode, "complexdomain") == 0 | |
301 | || strcmp (onset_mode, "complex") == 0) { | |
302 | aubio_onset_set_delay (o, 4.6 * o->hop_size); | |
303 | aubio_onset_set_threshold (o, 0.15); | |
304 | aubio_onset_set_awhitening(o, 1); | |
305 | aubio_onset_set_compression (o, 1.); | |
306 | } else if (strcmp (onset_mode, "phase") == 0) { | |
307 | o->apply_compression = 0; | |
308 | aubio_onset_set_awhitening (o, 0); | |
309 | } else if (strcmp (onset_mode, "mkl") == 0) { | |
310 | aubio_onset_set_threshold (o, 0.05); | |
311 | aubio_onset_set_awhitening(o, 1); | |
312 | aubio_onset_set_compression (o, 0.02); | |
313 | } else if (strcmp (onset_mode, "kl") == 0) { | |
314 | aubio_onset_set_threshold (o, 0.35); | |
315 | aubio_onset_set_awhitening(o, 1); | |
316 | aubio_onset_set_compression (o, 0.02); | |
317 | } else if (strcmp (onset_mode, "specflux") == 0) { | |
318 | aubio_onset_set_threshold (o, 0.18); | |
319 | aubio_onset_set_awhitening(o, 1); | |
320 | aubio_spectral_whitening_set_relax_time(o->spectral_whitening, 100); | |
321 | aubio_spectral_whitening_set_floor(o->spectral_whitening, 1.); | |
322 | aubio_onset_set_compression (o, 10.); | |
323 | } else if (strcmp (onset_mode, "specdiff") == 0) { | |
324 | } else if (strcmp (onset_mode, "old_default") == 0) { | |
325 | // used to reproduce results obtained with the previous version | |
326 | aubio_onset_set_threshold (o, 0.3); | |
327 | aubio_onset_set_minioi_ms (o, 20.); | |
328 | aubio_onset_set_compression (o, 0.); | |
329 | } else { | |
330 | AUBIO_WRN("onset: unknown spectral descriptor type %s, " | |
331 | "using default parameters.\n", onset_mode); | |
332 | ret = AUBIO_FAIL; | |
333 | } | |
334 | return ret; | |
335 | } | |
336 | ||
230 | 337 | void del_aubio_onset (aubio_onset_t *o) |
231 | 338 | { |
339 | del_aubio_spectral_whitening(o->spectral_whitening); | |
232 | 340 | del_aubio_specdesc(o->od); |
233 | 341 | del_aubio_peakpicker(o->pp); |
234 | 342 | del_aubio_pvoc(o->pv); |
116 | 116 | */ |
117 | 117 | smpl_t aubio_onset_get_last_ms (const aubio_onset_t *o); |
118 | 118 | |
119 | /** set onset detection adaptive whitening | |
120 | ||
121 | \param o onset detection object as returned by new_aubio_onset() | |
122 | \param enable 1 to enable, 0 to disable | |
123 | ||
124 | \return 0 if successful, 1 otherwise | |
125 | ||
126 | */ | |
127 | uint_t aubio_onset_set_awhitening(aubio_onset_t * o, uint_t enable); | |
128 | ||
129 | /** get onset detection adaptive whitening | |
130 | ||
131 | \param o onset detection object as returned by new_aubio_onset() | |
132 | ||
133 | \return 1 if enabled, 0 otherwise | |
134 | ||
135 | */ | |
136 | smpl_t aubio_onset_get_awhitening(aubio_onset_t * o); | |
137 | ||
138 | /** set or disable log compression | |
139 | ||
140 | \param o onset detection object as returned by new_aubio_onset() | |
141 | \param lambda logarithmic compression factor, 0 to disable | |
142 | ||
143 | \return 0 if successful, 1 otherwise | |
144 | ||
145 | */ | |
146 | uint_t aubio_onset_set_compression(aubio_onset_t *o, smpl_t lambda); | |
147 | ||
148 | /** get onset detection log compression | |
149 | ||
150 | \param o onset detection object as returned by new_aubio_onset() | |
151 | ||
152 | \return 0 if disabled, compression factor otherwise | |
153 | ||
154 | */ | |
155 | smpl_t aubio_onset_get_compression(aubio_onset_t *o); | |
156 | ||
119 | 157 | /** set onset detection silence threshold |
120 | 158 | |
121 | 159 | \param o onset detection object as returned by new_aubio_onset() |
272 | 310 | |
273 | 311 | */ |
274 | 312 | smpl_t aubio_onset_get_threshold(const aubio_onset_t * o); |
313 | ||
314 | /** set default parameters | |
315 | ||
316 | \param o onset detection object as returned by new_aubio_onset() | |
317 | \param onset_mode detection mode to adjust | |
318 | ||
319 | This function is called at the end of new_aubio_onset(). | |
320 | ||
321 | */ | |
322 | uint_t aubio_onset_set_default_parameters (aubio_onset_t * o, const char_t * onset_mode); | |
323 | ||
324 | /** reset onset detection | |
325 | ||
326 | \param o onset detection object as returned by new_aubio_onset() | |
327 | ||
328 | Reset current time and last onset to 0. | |
329 | ||
330 | This function is called at the end of new_aubio_onset(). | |
331 | ||
332 | */ | |
333 | void aubio_onset_reset(aubio_onset_t * o); | |
275 | 334 | |
276 | 335 | /** delete onset detection object |
277 | 336 |
91 | 91 | fvec_t *thresholded = p->thresholded; |
92 | 92 | fvec_t *scratch = p->scratch; |
93 | 93 | smpl_t mean = 0., median = 0.; |
94 | uint_t length = p->win_post + p->win_pre + 1; | |
95 | 94 | uint_t j = 0; |
96 | 95 | |
97 | /* store onset in onset_keep */ | |
98 | /* shift all elements but last, then write last */ | |
99 | for (j = 0; j < length - 1; j++) { | |
100 | onset_keep->data[j] = onset_keep->data[j + 1]; | |
101 | onset_proc->data[j] = onset_keep->data[j]; | |
102 | } | |
103 | onset_keep->data[length - 1] = onset->data[0]; | |
104 | onset_proc->data[length - 1] = onset->data[0]; | |
96 | /* push new novelty to the end */ | |
97 | fvec_push(onset_keep, onset->data[0]); | |
98 | /* store a copy */ | |
99 | fvec_copy(onset_keep, onset_proc); | |
105 | 100 | |
106 | /* filter onset_proc */ | |
107 | /** \bug filtfilt calculated post+pre times, should be only once !? */ | |
101 | /* filter this copy */ | |
108 | 102 | aubio_filter_do_filtfilt (p->biquad, onset_proc, scratch); |
109 | 103 | |
110 | 104 | /* calculate mean and median for onset_proc */ |
111 | 105 | mean = fvec_mean (onset_proc); |
112 | /* copy to scratch */ | |
113 | for (j = 0; j < length; j++) | |
114 | scratch->data[j] = onset_proc->data[j]; | |
106 | ||
107 | /* copy to scratch and compute its median */ | |
108 | fvec_copy(onset_proc, scratch); | |
115 | 109 | median = p->thresholdfn (scratch); |
116 | 110 | |
117 | 111 | /* shift peek array */ |
184 | 178 | generated with octave butter function: [b,a] = butter(2, 0.34); |
185 | 179 | */ |
186 | 180 | t->biquad = new_aubio_filter_biquad (0.15998789, 0.31997577, 0.15998789, |
187 | -0.59488894, 0.23484048); | |
181 | // FIXME: broken since c9e20ca, revert for now | |
182 | //-0.59488894, 0.23484048); | |
183 | 0.23484048, 0); | |
188 | 184 | |
189 | 185 | return t; |
190 | 186 | } |
110 | 110 | { |
111 | 111 | aubio_pitch_t *p = AUBIO_NEW (aubio_pitch_t); |
112 | 112 | aubio_pitch_type pitch_type; |
113 | if (pitch_mode == NULL) { | |
114 | AUBIO_ERR ("pitch: can not use 'NULL' for pitch detection method\n"); | |
115 | goto beach; | |
116 | } | |
113 | 117 | if (strcmp (pitch_mode, "mcomb") == 0) |
114 | 118 | pitch_type = aubio_pitcht_mcomb; |
115 | 119 | else if (strcmp (pitch_mode, "yinfft") == 0) |
154 | 158 | case aubio_pitcht_yin: |
155 | 159 | p->buf = new_fvec (bufsize); |
156 | 160 | p->p_object = new_aubio_pitchyin (bufsize); |
161 | if (!p->p_object) goto beach; | |
157 | 162 | p->detect_cb = aubio_pitch_do_yin; |
158 | 163 | p->conf_cb = (aubio_pitch_get_conf_t)aubio_pitchyin_get_confidence; |
159 | 164 | aubio_pitchyin_set_tolerance (p->p_object, 0.15); |
161 | 166 | case aubio_pitcht_mcomb: |
162 | 167 | p->filtered = new_fvec (hopsize); |
163 | 168 | p->pv = new_aubio_pvoc (bufsize, hopsize); |
169 | if (!p->pv) goto beach; | |
164 | 170 | p->fftgrain = new_cvec (bufsize); |
165 | 171 | p->p_object = new_aubio_pitchmcomb (bufsize, hopsize); |
166 | 172 | p->filter = new_aubio_filter_c_weighting (samplerate); |
169 | 175 | case aubio_pitcht_fcomb: |
170 | 176 | p->buf = new_fvec (bufsize); |
171 | 177 | p->p_object = new_aubio_pitchfcomb (bufsize, hopsize); |
178 | if (!p->p_object) goto beach; | |
172 | 179 | p->detect_cb = aubio_pitch_do_fcomb; |
173 | 180 | break; |
174 | 181 | case aubio_pitcht_schmitt: |
179 | 186 | case aubio_pitcht_yinfft: |
180 | 187 | p->buf = new_fvec (bufsize); |
181 | 188 | p->p_object = new_aubio_pitchyinfft (samplerate, bufsize); |
189 | if (!p->p_object) goto beach; | |
182 | 190 | p->detect_cb = aubio_pitch_do_yinfft; |
183 | 191 | p->conf_cb = (aubio_pitch_get_conf_t)aubio_pitchyinfft_get_confidence; |
184 | 192 | aubio_pitchyinfft_set_tolerance (p->p_object, 0.85); |
186 | 194 | case aubio_pitcht_specacf: |
187 | 195 | p->buf = new_fvec (bufsize); |
188 | 196 | p->p_object = new_aubio_pitchspecacf (bufsize); |
197 | if (!p->p_object) goto beach; | |
189 | 198 | p->detect_cb = aubio_pitch_do_specacf; |
190 | 199 | p->conf_cb = (aubio_pitch_get_conf_t)aubio_pitchspecacf_get_tolerance; |
191 | 200 | aubio_pitchspecacf_set_tolerance (p->p_object, 0.85); |
196 | 205 | return p; |
197 | 206 | |
198 | 207 | beach: |
208 | if (p->filtered) del_fvec(p->filtered); | |
209 | if (p->buf) del_fvec(p->buf); | |
199 | 210 | AUBIO_FREE(p); |
200 | 211 | return NULL; |
201 | 212 | } |
52 | 52 | aubio_pitchfcomb_t *p = AUBIO_NEW (aubio_pitchfcomb_t); |
53 | 53 | p->fftSize = bufsize; |
54 | 54 | p->stepSize = hopsize; |
55 | p->fft = new_aubio_fft (bufsize); | |
56 | if (!p->fft) goto beach; | |
55 | 57 | p->winput = new_fvec (bufsize); |
56 | 58 | p->fftOut = new_cvec (bufsize); |
57 | 59 | p->fftLastPhase = new_fvec (bufsize); |
58 | p->fft = new_aubio_fft (bufsize); | |
59 | 60 | p->win = new_aubio_window ("hanning", bufsize); |
60 | 61 | return p; |
62 | ||
63 | beach: | |
64 | AUBIO_FREE(p); | |
65 | return NULL; | |
61 | 66 | } |
62 | 67 | |
63 | 68 | /* input must be stepsize long */ |
41 | 41 | new_aubio_pitchspecacf (uint_t bufsize) |
42 | 42 | { |
43 | 43 | aubio_pitchspecacf_t *p = AUBIO_NEW (aubio_pitchspecacf_t); |
44 | p->fft = new_aubio_fft (bufsize); | |
45 | if (!p->fft) goto beach; | |
44 | 46 | p->win = new_aubio_window ("hanningz", bufsize); |
45 | 47 | p->winput = new_fvec (bufsize); |
46 | p->fft = new_aubio_fft (bufsize); | |
47 | 48 | p->fftout = new_fvec (bufsize); |
48 | 49 | p->sqrmag = new_fvec (bufsize); |
49 | 50 | p->acf = new_fvec (bufsize / 2 + 1); |
50 | 51 | p->tol = 1.; |
51 | 52 | p->confidence = 0.; |
52 | 53 | return p; |
54 | ||
55 | beach: | |
56 | AUBIO_FREE(p); | |
57 | return NULL; | |
53 | 58 | } |
54 | 59 | |
55 | 60 | void |
61 | 61 | aubio_pitchyinfft_t *p = AUBIO_NEW (aubio_pitchyinfft_t); |
62 | 62 | p->winput = new_fvec (bufsize); |
63 | 63 | p->fft = new_aubio_fft (bufsize); |
64 | if (!p->fft) goto beach; | |
64 | 65 | p->fftout = new_fvec (bufsize); |
65 | 66 | p->sqrmag = new_fvec (bufsize); |
66 | 67 | p->yinfft = new_fvec (bufsize / 2 + 1); |
94 | 95 | // check for octave errors above 1300 Hz |
95 | 96 | p->short_period = (uint_t)ROUND(samplerate / 1300.); |
96 | 97 | return p; |
98 | ||
99 | beach: | |
100 | if (p->winput) del_fvec(p->winput); | |
101 | AUBIO_FREE(p); | |
102 | return NULL; | |
97 | 103 | } |
98 | 104 | |
99 | 105 | void |
0 | /* | |
1 | * Copyright (C) 2003-2015 Paul Brossier <piem@aubio.org> | |
2 | * | |
3 | * This file is part of aubio. | |
4 | * | |
5 | * aubio is free software: you can redistribute it and/or modify it under the | |
6 | * terms of the GNU General Public License as published by the Free Software | |
7 | * Foundation, either version 3 of the License, or (at your option) any later | |
8 | * version. | |
9 | * | |
10 | * aubio is distributed in the hope that it will be useful, but WITHOUT ANY | |
11 | * WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS | |
12 | * FOR A PARTICULAR PURPOSE. See the GNU General Public License for more | |
13 | * details. | |
14 | * | |
15 | * You should have received a copy of the GNU General Public License along with | |
16 | * aubio. If not, see <http://www.gnu.org/licenses/>. | |
17 | * | |
18 | */ | |
19 | ||
20 | #include "aubio_priv.h" | |
21 | #include "fvec.h" | |
22 | #include "cvec.h" | |
23 | #include "mathutils.h" | |
24 | #include "spectral/awhitening.h" | |
25 | ||
26 | #define aubio_spectral_whitening_default_relax_time 250 // in seconds, between 22 and 446 | |
27 | #define aubio_spectral_whitening_default_decay 0.001 // -60dB attenuation | |
28 | #define aubio_spectral_whitening_default_floor 1.e-4 // from 1.e-6 to .2 | |
29 | ||
30 | /** structure to store object state */ | |
31 | struct _aubio_spectral_whitening_t { | |
32 | uint_t buf_size; | |
33 | uint_t hop_size; | |
34 | uint_t samplerate; | |
35 | smpl_t relax_time; | |
36 | smpl_t r_decay; | |
37 | smpl_t floor; | |
38 | fvec_t *peak_values; | |
39 | }; | |
40 | ||
41 | void | |
42 | aubio_spectral_whitening_do (aubio_spectral_whitening_t * o, cvec_t * fftgrain) | |
43 | { | |
44 | uint_t i = 0; | |
45 | for (i = 0; i < o->peak_values->length; i++) { | |
46 | smpl_t tmp = MAX(o->r_decay * o->peak_values->data[i], o->floor); | |
47 | o->peak_values->data[i] = MAX(fftgrain->norm[i], tmp); | |
48 | fftgrain->norm[i] /= o->peak_values->data[i]; | |
49 | } | |
50 | } | |
51 | ||
52 | aubio_spectral_whitening_t * | |
53 | new_aubio_spectral_whitening (uint_t buf_size, uint_t hop_size, uint_t samplerate) | |
54 | { | |
55 | aubio_spectral_whitening_t *o = AUBIO_NEW (aubio_spectral_whitening_t); | |
56 | if ((sint_t)buf_size < 1) { | |
57 | AUBIO_ERR("spectral_whitening: got buffer_size %d, but can not be < 1\n", buf_size); | |
58 | goto beach; | |
59 | } else if ((sint_t)hop_size < 1) { | |
60 | AUBIO_ERR("spectral_whitening: got hop_size %d, but can not be < 1\n", hop_size); | |
61 | goto beach; | |
62 | } else if ((sint_t)samplerate < 1) { | |
63 | AUBIO_ERR("spectral_whitening: got samplerate %d, but can not be < 1\n", samplerate); | |
64 | goto beach; | |
65 | } | |
66 | o->peak_values = new_fvec (buf_size / 2 + 1); | |
67 | o->buf_size = buf_size; | |
68 | o->hop_size = hop_size; | |
69 | o->samplerate = samplerate; | |
70 | o->floor = aubio_spectral_whitening_default_floor; | |
71 | aubio_spectral_whitening_set_relax_time (o, aubio_spectral_whitening_default_relax_time); | |
72 | aubio_spectral_whitening_reset (o); | |
73 | return o; | |
74 | ||
75 | beach: | |
76 | AUBIO_FREE(o); | |
77 | return NULL; | |
78 | } | |
79 | ||
80 | uint_t | |
81 | aubio_spectral_whitening_set_relax_time (aubio_spectral_whitening_t * o, smpl_t relax_time) | |
82 | { | |
83 | o->relax_time = relax_time; | |
84 | o->r_decay = POW (aubio_spectral_whitening_default_decay, | |
85 | (o->hop_size / (float) o->samplerate) / o->relax_time); | |
86 | return AUBIO_OK; | |
87 | } | |
88 | ||
89 | smpl_t | |
90 | aubio_spectral_whitening_get_relax_time (aubio_spectral_whitening_t * o) | |
91 | { | |
92 | return o->relax_time; | |
93 | } | |
94 | ||
95 | uint_t | |
96 | aubio_spectral_whitening_set_floor (aubio_spectral_whitening_t *o, smpl_t floor) | |
97 | { | |
98 | o->floor = floor; | |
99 | return AUBIO_OK; | |
100 | } | |
101 | ||
102 | smpl_t aubio_spectral_whitening_get_floor (aubio_spectral_whitening_t *o) | |
103 | { | |
104 | return o->floor; | |
105 | } | |
106 | ||
107 | void | |
108 | aubio_spectral_whitening_reset (aubio_spectral_whitening_t * o) | |
109 | { | |
110 | /* cover the case n == 0. */ | |
111 | fvec_set_all (o->peak_values, o->floor); | |
112 | } | |
113 | ||
114 | void | |
115 | del_aubio_spectral_whitening (aubio_spectral_whitening_t * o) | |
116 | { | |
117 | del_fvec (o->peak_values); | |
118 | AUBIO_FREE (o); | |
119 | } |
0 | /* | |
1 | Copyright (C) 2003-2015 Paul Brossier <piem@aubio.org> | |
2 | ||
3 | This file is part of aubio. | |
4 | ||
5 | aubio is free software: you can redistribute it and/or modify | |
6 | it under the terms of the GNU General Public License as published by | |
7 | the Free Software Foundation, either version 3 of the License, or | |
8 | (at your option) any later version. | |
9 | ||
10 | aubio is distributed in the hope that it will be useful, | |
11 | but WITHOUT ANY WARRANTY; without even the implied warranty of | |
12 | MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
13 | GNU General Public License for more details. | |
14 | ||
15 | You should have received a copy of the GNU General Public License | |
16 | along with aubio. If not, see <http://www.gnu.org/licenses/>. | |
17 | ||
18 | */ | |
19 | ||
20 | /** \file | |
21 | ||
22 | Spectral adaptive whitening | |
23 | ||
24 | References: | |
25 | ||
26 | D. Stowell and M. D. Plumbley. Adaptive whitening for improved real-time | |
27 | audio onset detection. In Proceedings of the International Computer Music | |
28 | Conference (ICMC), 2007, Copenhagen, Denmark. | |
29 | ||
30 | http://www.eecs.qmul.ac.uk/~markp/2007/StowellPlumbley07-icmc.pdf | |
31 | ||
32 | S. Böck, F. Krebs, and M. Schedl. Evaluating the Online Capabilities of | |
33 | Onset Detection Methods. In Proceedings of the 13th International Society for | |
34 | Music Information Retrieval Conference (ISMIR), 2012, Porto, Portugal. | |
35 | ||
36 | http://ismir2012.ismir.net/event/papers/049_ISMIR_2012.pdf | |
37 | http://www.cp.jku.at/research/papers/Boeck_etal_ISMIR_2012.pdf | |
38 | ||
39 | */ | |
40 | ||
41 | ||
42 | #ifndef _AUBIO_SPECTRAL_WHITENING_H | |
43 | #define _AUBIO_SPECTRAL_WHITENING_H | |
44 | ||
45 | #ifdef __cplusplus | |
46 | extern "C" { | |
47 | #endif | |
48 | ||
49 | /** spectral whitening structure */ | |
50 | typedef struct _aubio_spectral_whitening_t aubio_spectral_whitening_t; | |
51 | ||
52 | /** execute spectral adaptive whitening, in-place | |
53 | ||
54 | \param o spectral whitening object as returned by new_aubio_spectral_whitening() | |
55 | \param fftgrain input signal spectrum as computed by aubio_pvoc_do() or aubio_fft_do() | |
56 | ||
57 | */ | |
58 | void aubio_spectral_whitening_do (aubio_spectral_whitening_t * o, | |
59 | cvec_t * fftgrain); | |
60 | ||
61 | /** creation of a spectral whitening object | |
62 | ||
63 | \param buf_size window size of input grains | |
64 | \param hop_size number of samples between two consecutive input grains | |
65 | \param samplerate sampling rate of the input signal | |
66 | ||
67 | */ | |
68 | aubio_spectral_whitening_t *new_aubio_spectral_whitening (uint_t buf_size, | |
69 | uint_t hop_size, | |
70 | uint_t samplerate); | |
71 | ||
72 | /** reset spectral whitening object | |
73 | ||
74 | \param o spectral whitening object as returned by new_aubio_spectral_whitening() | |
75 | ||
76 | */ | |
77 | void aubio_spectral_whitening_reset (aubio_spectral_whitening_t * o); | |
78 | ||
79 | /** set relaxation time for spectral whitening | |
80 | ||
81 | \param o spectral whitening object as returned by new_aubio_spectral_whitening() | |
82 | \param relax_time relaxation time in seconds, between 20 and 500; defaults to 250 | |
83 | ||
84 | */ | |
85 | uint_t aubio_spectral_whitening_set_relax_time (aubio_spectral_whitening_t * o, | |
86 | smpl_t relax_time); | |
87 | ||
88 | /** get relaxation time of spectral whitening | |
89 | ||
90 | \param o spectral whitening object as returned by new_aubio_spectral_whitening() | |
91 | \return relaxation time in seconds | |
92 | ||
93 | */ | |
94 | smpl_t aubio_spectral_whitening_get_relax_time (aubio_spectral_whitening_t * o); | |
95 | ||
96 | /** set floor for spectral whitening | |
97 | ||
98 | \param o spectral whitening object as returned by new_aubio_spectral_whitening() | |
99 | \param floor value (typically between 1.e-6 and .2, defaults to 1.e-4) | |
100 | ||
101 | */ | |
102 | uint_t aubio_spectral_whitening_set_floor (aubio_spectral_whitening_t * o, | |
103 | smpl_t floor); | |
104 | ||
105 | /** get floor of spectral whitening | |
106 | ||
107 | \param o spectral whitening object as returned by new_aubio_spectral_whitening() | |
108 | \return floor value | |
109 | ||
110 | */ | |
111 | smpl_t aubio_spectral_whitening_get_floor (aubio_spectral_whitening_t * o); | |
112 | ||
113 | /** deletion of a spectral whitening | |
114 | ||
115 | \param o spectral whitening object as returned by new_aubio_spectral_whitening() | |
116 | ||
117 | */ | |
118 | void del_aubio_spectral_whitening (aubio_spectral_whitening_t * o); | |
119 | ||
120 | #ifdef __cplusplus | |
121 | } | |
122 | #endif | |
123 | ||
124 | #endif /* _AUBIO_SPECTRAL_WHITENING_H */ |
29 | 29 | void aubio_specdesc_hfc(aubio_specdesc_t *o, const cvec_t * fftgrain, fvec_t * onset); |
30 | 30 | void aubio_specdesc_complex(aubio_specdesc_t *o, const cvec_t * fftgrain, fvec_t * onset); |
31 | 31 | void aubio_specdesc_phase(aubio_specdesc_t *o, const cvec_t * fftgrain, fvec_t * onset); |
32 | void aubio_specdesc_wphase(aubio_specdesc_t *o, const cvec_t * fftgrain, fvec_t * onset); | |
32 | 33 | void aubio_specdesc_specdiff(aubio_specdesc_t *o, const cvec_t * fftgrain, fvec_t * onset); |
33 | 34 | void aubio_specdesc_kl(aubio_specdesc_t *o, const cvec_t * fftgrain, fvec_t * onset); |
34 | 35 | void aubio_specdesc_mkl(aubio_specdesc_t *o, const cvec_t * fftgrain, fvec_t * onset); |
56 | 57 | aubio_onset_hfc, /**< high frequency content */ |
57 | 58 | aubio_onset_complex, /**< complex domain */ |
58 | 59 | aubio_onset_phase, /**< phase fast */ |
60 | aubio_onset_wphase, /**< weighted phase */ | |
59 | 61 | aubio_onset_kl, /**< Kullback-Leibler */
60 | 62 | aubio_onset_mkl, /**< modified Kullback-Leibler */
61 | 63 | aubio_onset_specflux, /**< spectral flux */ |
156 | 158 | /* its mean is the result */ |
157 | 159 | onset->data[0] = aubio_hist_mean(o->histog); |
158 | 160 | //onset->data[0] = fvec_mean(o->dev1); |
161 | } | |
162 | ||
163 | /* weighted phase */ | |
164 | void | |
165 | aubio_specdesc_wphase(aubio_specdesc_t *o, | |
166 | const cvec_t *fftgrain, fvec_t *onset) { | |
167 | uint_t i; | |
168 | aubio_specdesc_phase(o, fftgrain, onset); | |
169 | for (i = 0; i < fftgrain->length; i++) { | |
170 | o->dev1->data[i] *= fftgrain->norm[i]; | |
171 | } | |
172 | /* apply o->histogram */ | |
173 | aubio_hist_dyn_notnull(o->histog,o->dev1); | |
174 | /* weight it */ | |
175 | aubio_hist_weight(o->histog); | |
176 | /* its mean is the result */ | |
177 | onset->data[0] = aubio_hist_mean(o->histog); | |
159 | 178 | } |
160 | 179 | |
161 | 180 | /* Spectral difference method onset detection function */ |
249 | 268 | onset_type = aubio_onset_complex; |
250 | 269 | else if (strcmp (onset_mode, "phase") == 0) |
251 | 270 | onset_type = aubio_onset_phase; |
271 | else if (strcmp (onset_mode, "wphase") == 0) | |
272 | onset_type = aubio_onset_wphase; | |
252 | 273 | else if (strcmp (onset_mode, "mkl") == 0) |
253 | 274 | onset_type = aubio_onset_mkl; |
254 | 275 | else if (strcmp (onset_mode, "kl") == 0) |
269 | 290 | onset_type = aubio_specmethod_decrease; |
270 | 291 | else if (strcmp (onset_mode, "rolloff") == 0) |
271 | 292 | onset_type = aubio_specmethod_rolloff; |
293 | else if (strcmp (onset_mode, "old_default") == 0) | |
294 | onset_type = aubio_onset_default; | |
272 | 295 | else if (strcmp (onset_mode, "default") == 0) |
273 | 296 | onset_type = aubio_onset_default; |
274 | 297 | else { |
290 | 313 | o->theta2 = new_fvec(rsize); |
291 | 314 | break; |
292 | 315 | case aubio_onset_phase: |
316 | case aubio_onset_wphase: | |
293 | 317 | o->dev1 = new_fvec(rsize); |
294 | 318 | o->theta1 = new_fvec(rsize); |
295 | 319 | o->theta2 = new_fvec(rsize); |
324 | 348 | case aubio_onset_phase: |
325 | 349 | o->funcpointer = aubio_specdesc_phase; |
326 | 350 | break; |
351 | case aubio_onset_wphase: | |
352 | o->funcpointer = aubio_specdesc_wphase; | |
353 | break; | |
327 | 354 | case aubio_onset_specdiff: |
328 | 355 | o->funcpointer = aubio_specdesc_specdiff; |
329 | 356 | break; |
377 | 404 | del_fvec(o->theta2); |
378 | 405 | break; |
379 | 406 | case aubio_onset_phase: |
407 | case aubio_onset_wphase: | |
380 | 408 | del_fvec(o->dev1); |
381 | 409 | del_fvec(o->theta1); |
382 | 410 | del_fvec(o->theta2); |
58 | 58 | Conference on Acoustics Speech and Signal Processing, pages 441–444,
59 | 59 | Hong-Kong, 2003. |
60 | 60 | |
61 | \b \p wphase : Weighted Phase Deviation onset detection function | |
62 | ||
63 | S. Dixon. Onset detection revisited. In Proceedings of the 9th International | |
63 | Conference on Digital Audio Effects (DAFx), pages 133–137, 2006. | |
65 | ||
66 | http://www.eecs.qmul.ac.uk/~simond/pub/2006/dafx.pdf | |
67 | ||
61 | 68 | \b \p specdiff : Spectral difference method onset detection function |
62 | 69 | |
63 | 70 | Jonhatan Foote and Shingo Uchihashi. The beat spectrum: a new approach to |
173 | 180 | |
174 | 181 | The parameter \p method is a string that can be any of: |
175 | 182 | |
176 | - `energy`, `hfc`, `complex`, `phase`, `specdiff`, `kl`, `mkl`, `specflux` | |
177 | - `centroid`, `spread`, `skewness`, `kurtosis`, `slope`, `decrease`, `rolloff` | |
183 | - onset novelty functions: `complex`, `energy`, `hfc`, `kl`, `mkl`, | |
184 | `phase`, `specdiff`, `specflux`, `wphase`, | |
185 | ||
186 | - spectral descriptors: `centroid`, `decrease`, `kurtosis`, `rolloff`, | |
187 | `skewness`, `slope`, `spread`. | |
178 | 188 | |
179 | 189 | */ |
180 | 190 | aubio_specdesc_t *new_aubio_specdesc (const char_t * method, uint_t buf_size); |
102 | 102 | for (i = 0; i < output->length; i++) { |
103 | 103 | output->data[i] += input->data[i]; |
104 | 104 | } |
105 | fvec_clamp(output, 1.); | |
105 | 106 | } |
106 | 107 | } |
107 | 108 |
153 | 153 | |
154 | 154 | \param o beat tracking object |
155 | 155 | |
156 | \return confidence with which the tempo has been observed, `0` if no | |
157 | consistent value is found. | |
156 | \return confidence with which the tempo has been observed, the higher the | |
157 | more confidence, `0` if no consistent value is found. | |
158 | 158 | |
159 | 159 | */ |
160 | 160 | smpl_t aubio_tempo_get_confidence(aubio_tempo_t * o); |
40 | 40 | bs->data[2] = b2; |
41 | 41 | as->data[0] = 1.; |
42 | 42 | as->data[1] = a1; |
43 | as->data[1] = a2; | |
43 | as->data[2] = a2; | |
44 | 44 | return AUBIO_OK; |
45 | 45 | } |
46 | 46 |
40 | 40 | |
41 | 41 | #include "aubio.h" |
42 | 42 | |
43 | BOOL APIENTRY DllMain( HMODULE hModule, | |
43 | BOOL APIENTRY DllMain( HMODULE hModule UNUSED, | |
44 | 44 | DWORD ul_reason_for_call, |
45 | LPVOID lpReserved ) | |
45 | LPVOID lpReserved UNUSED) | |
46 | 46 | { |
47 | 47 | switch (ul_reason_for_call) |
48 | 48 | { |
6 | 6 | uselib += ['SNDFILE'] |
7 | 7 | uselib += ['AVCODEC'] |
8 | 8 | uselib += ['AVFORMAT'] |
9 | uselib += ['SWRESAMPLE'] | |
9 | 10 | uselib += ['AVRESAMPLE'] |
10 | 11 | uselib += ['AVUTIL'] |
11 | 12 | uselib += ['BLAS'] |
0 | #include <aubio.h> | |
1 | #include "utils_tests.h" | |
2 | ||
3 | int main (int argc, char **argv) | |
4 | { | |
5 | sint_t err = 0; | |
6 | ||
7 | if (argc < 3) { | |
8 | err = 2; | |
9 | PRINT_ERR("not enough arguments\n"); | |
10 | PRINT_MSG("usage: %s <input_path> <output_path> [samplerate] [hop_size]\n", argv[0]); | |
11 | return err; | |
12 | } | |
13 | ||
14 | uint_t samplerate = 0; | |
15 | uint_t win_size = 1024; | |
16 | uint_t hop_size = 512; | |
17 | uint_t n_frames = 0, read = 0; | |
18 | ||
19 | char_t *source_path = argv[1]; | |
20 | char_t *sink_path = argv[2]; | |
21 | ||
22 | if ( argc >= 4 ) samplerate = atoi(argv[3]); | |
23 | if ( argc >= 5 ) hop_size = atoi(argv[4]); | |
24 | if ( argc >= 6 ) { | |
25 | err = 2; | |
26 | PRINT_ERR("too many arguments\n"); | |
27 | return err; | |
28 | } | |
29 | ||
30 | fvec_t *vec = new_fvec(hop_size); | |
31 | fvec_t *out = new_fvec(hop_size); // output buffer | |
32 | fvec_t *scale = new_fvec(hop_size); | |
33 | cvec_t *fftgrain = new_cvec(win_size); // fft norm and phase | |
34 | if (!vec) { err = 1; goto beach_fvec; } | |
35 | ||
36 | aubio_source_t *i = new_aubio_source(source_path, samplerate, hop_size); | |
37 | if (!i) { err = 1; goto beach_source; } | |
38 | ||
39 | if (samplerate == 0 ) samplerate = aubio_source_get_samplerate(i); | |
40 | ||
41 | aubio_sink_t *o = new_aubio_sink(sink_path, samplerate); | |
42 | if (!o) { err = 1; goto beach_sink; } | |
43 | ||
44 | aubio_pvoc_t *pv = new_aubio_pvoc(win_size, hop_size); | |
45 | ||
46 | aubio_spectral_whitening_t *awhitening = | |
47 | new_aubio_spectral_whitening (win_size, hop_size, samplerate); | |
48 | ||
49 | aubio_spectral_whitening_set_relax_time(awhitening, 20.); | |
50 | fvec_set_all(scale, 3.); | |
51 | ||
52 | PRINT_MSG("spectral whitening relaxation time is %f\n", | |
53 | aubio_spectral_whitening_get_relax_time(awhitening)); | |
54 | ||
55 | do { | |
56 | aubio_source_do(i, vec, &read); | |
57 | aubio_pvoc_do(pv, vec, fftgrain); | |
58 | // apply spectral whitening | |
59 | aubio_spectral_whitening_do(awhitening, fftgrain); | |
60 | // rebuild the signal | |
61 | aubio_pvoc_rdo(pv, fftgrain, out); | |
62 | // make louder | |
63 | fvec_weight(out, scale); | |
64 | // make sure we don't saturate | |
65 | fvec_clamp(out, 1.); | |
66 | // write output | |
67 | aubio_sink_do(o, out, read); | |
68 | n_frames += read; | |
69 | } while ( read == hop_size ); | |
70 | ||
71 | PRINT_MSG("read %d frames at %dHz (%d blocks) from %s written to %s\n", | |
72 | n_frames, samplerate, n_frames / hop_size, | |
73 | source_path, sink_path); | |
74 | ||
75 | del_aubio_spectral_whitening(awhitening); | |
76 | del_aubio_pvoc(pv); | |
77 | del_aubio_sink(o); | |
78 | beach_sink: | |
79 | del_aubio_source(i); | |
80 | beach_source: | |
81 | del_fvec(vec); | |
82 | del_fvec(out); | |
83 | del_fvec(scale); | |
84 | del_cvec(fftgrain); | |
85 | beach_fvec: | |
86 | return err; | |
87 | } | |
88 |
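The C example above drives `aubio_spectral_whitening_do` through the phase vocoder. The underlying recurrence — a per-bin peak tracker with exponential decay, in the spirit of Stowell & Plumbley's adaptive whitening that the changelog references — can be sketched in pure Python. This is a hypothetical illustration, not aubio's implementation: `whiten`, `relax_frames`, and `floor` are invented names, and aubio derives its decay factor from `relax_time` and the samplerate/hop size rather than a frame count.

```python
# Hypothetical sketch of adaptive whitening: each spectral bin is divided
# by a running per-bin peak estimate that decays toward `floor` over
# roughly `relax_frames` frames. Not aubio's actual code.
def whiten(frames, relax_frames=100, floor=1e-4):
    decay = floor ** (1.0 / relax_frames)  # per-frame decay factor
    peaks = None
    out = []
    for mags in frames:
        if peaks is None:
            # initialize the peak tracker from the first frame
            peaks = [max(m, floor) for m in mags]
        else:
            # raise the peak on louder input, otherwise let it decay
            peaks = [max(m, floor, p * decay) for m, p in zip(mags, peaks)]
        out.append([m / p for m, p in zip(mags, peaks)])
    return out
```

The effect is that every bin is normalized to its recent maximum, so quiet bands contribute to the novelty function as much as loud ones — which is why whitening helps the onset detectors mentioned in `src/onset/onset.c`.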
0 | #! python | |
1 | import os | |
2 | import sys | |
3 | ||
4 | __version_info = {} # keep a reference to parse VERSION once | |
5 | ||
6 | def get_version_info(): | |
7 | # read from VERSION | |
8 | # return dictionary filled with content of version | |
9 | if not __version_info: | |
10 | this_file_dir = os.path.dirname(os.path.abspath(__file__)) | |
11 | version_file = os.path.join(this_file_dir, 'VERSION') | |
12 | ||
13 | if not os.path.isfile(version_file): | |
14 | raise SystemError("VERSION file not found.") | |
15 | ||
16 | for l in open(version_file).readlines(): | |
17 | if l.startswith('AUBIO_MAJOR_VERSION'): | |
18 | __version_info['AUBIO_MAJOR_VERSION'] = int(l.split('=')[1]) | |
19 | if l.startswith('AUBIO_MINOR_VERSION'): | |
20 | __version_info['AUBIO_MINOR_VERSION'] = int(l.split('=')[1]) | |
21 | if l.startswith('AUBIO_PATCH_VERSION'): | |
22 | __version_info['AUBIO_PATCH_VERSION'] = int(l.split('=')[1]) | |
23 | if l.startswith('AUBIO_VERSION_STATUS'): | |
24 | __version_info['AUBIO_VERSION_STATUS'] = \ | |
25 | l.split('=')[1].strip()[1:-1] | |
26 | ||
27 | if l.startswith('LIBAUBIO_LT_CUR'): | |
28 | __version_info['LIBAUBIO_LT_CUR'] = int(l.split('=')[1]) | |
29 | if l.startswith('LIBAUBIO_LT_REV'): | |
30 | __version_info['LIBAUBIO_LT_REV'] = int(l.split('=')[1]) | |
31 | if l.startswith('LIBAUBIO_LT_AGE'): | |
32 | __version_info['LIBAUBIO_LT_AGE'] = int(l.split('=')[1]) | |
33 | ||
34 | if len(__version_info) < 6: | |
35 | raise SystemError("Failed parsing VERSION file.") | |
36 | ||
37 | # switch version status with commit sha in alpha releases | |
38 | if __version_info['AUBIO_VERSION_STATUS'] and \ | |
39 | '~alpha' in __version_info['AUBIO_VERSION_STATUS']: | |
40 | AUBIO_GIT_SHA = get_git_revision_hash() | |
41 | if AUBIO_GIT_SHA: | |
42 | __version_info['AUBIO_VERSION_STATUS'] = '~git+' + AUBIO_GIT_SHA | |
43 | ||
44 | return __version_info | |
45 | ||
46 | def get_libaubio_version(): | |
47 | verfmt = '%(LIBAUBIO_LT_CUR)s.%(LIBAUBIO_LT_REV)s.%(LIBAUBIO_LT_AGE)s' | |
48 | return str(verfmt % get_version_info()) | |
49 | ||
50 | def get_aubio_version(): | |
51 | verfmt = '%(AUBIO_MAJOR_VERSION)s.%(AUBIO_MINOR_VERSION)s.%(AUBIO_PATCH_VERSION)s%(AUBIO_VERSION_STATUS)s' | |
52 | return str(verfmt % get_version_info()) | |
53 | ||
54 | def get_aubio_pyversion(): | |
55 | # convert to version for python according to pep 440 | |
56 | # see https://www.python.org/dev/peps/pep-0440/ | |
57 | # outputs MAJ.MIN.PATCH[a0[+git.<sha>[.mods]]] | |
58 | aubio_version = get_aubio_version() | |
59 | if '~git+' in aubio_version: | |
60 | pep440str = aubio_version.replace('+', '.') | |
61 | verstr = pep440str.replace('~git.', 'a0+') | |
62 | elif '~alpha' in aubio_version: | |
63 | verstr = aubio_version.replace('~alpha', 'a0') | |
64 | else: | |
65 | verstr = aubio_version | |
66 | return verstr | |
67 | ||
68 | def get_git_revision_hash(short=True): | |
69 | # get commit id, with +mods if local tree is not clean | |
70 | if not os.path.isdir('.git'): | |
71 | # print('Version : not in git repository : can\'t get sha') | |
72 | return None | |
73 | import subprocess | |
74 | aubio_dir = os.path.dirname(os.path.abspath(__file__)) | |
75 | if not os.path.exists(aubio_dir): | |
76 | raise SystemError("git / root folder not found") | |
77 | gitcmd = ['git', '-C', aubio_dir, 'rev-parse'] | |
78 | if short: | |
79 | gitcmd.append('--short') | |
80 | gitcmd.append('HEAD') | |
81 | try: | |
82 | gitsha = subprocess.check_output(gitcmd).strip().decode('utf8') | |
83 | except Exception as e: | |
84 | sys.stderr.write('git command error: %s\n' % e) | |
85 | return None | |
86 | # check if we have a clean tree | |
87 | gitcmd = ['git', '-C', aubio_dir, 'status', '--porcelain'] | |
88 | try: | |
89 | output = subprocess.check_output(gitcmd).decode('utf8') | |
90 | if len(output): | |
91 | sys.stderr.write('Info: current tree is not clean\n\n') | |
92 | sys.stderr.write(output + '\n') | |
93 | gitsha += '+mods' | |
94 | except subprocess.CalledProcessError as e: | |
95 | sys.stderr.write('git command error: %s\n' % e) | |
96 | pass | |
97 | return gitsha | |
98 | ||
99 | if __name__ == '__main__': | |
100 | if len(sys.argv) > 1 and sys.argv[1] == '-v': | |
101 | print (get_aubio_version()) | |
102 | elif len(sys.argv) > 1 and sys.argv[1] == '-p': | |
103 | print (get_aubio_pyversion()) | |
104 | else: | |
105 | print ('%30s'% 'aubio version:', get_aubio_version()) | |
106 | print ('%30s'% 'python-aubio version:', get_aubio_pyversion()) |
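The string rewriting in `get_aubio_pyversion` can be exercised in isolation. The helper below, `to_pep440`, is a hypothetical standalone mirror of that logic (the real function obtains its input from `get_aubio_version`, which reads the VERSION file): an `~alpha` status becomes the PEP 440 pre-release segment `a0`, and a `~git+sha[+mods]` status becomes `a0` plus a local version label.

```python
# Hypothetical standalone mirror of get_aubio_pyversion's PEP 440 rewriting.
def to_pep440(aubio_version):
    if '~git+' in aubio_version:
        # '0.4.5~git+abc123+mods' -> '0.4.5a0+abc123.mods'
        # PEP 440 local version labels use '.' as separator, so all '+'
        # past the first are folded into dots.
        pep440str = aubio_version.replace('+', '.')
        return pep440str.replace('~git.', 'a0+')
    if '~alpha' in aubio_version:
        # '0.4.5~alpha' -> '0.4.5a0' (pre-release segment)
        return aubio_version.replace('~alpha', 'a0')
    return aubio_version
```

Released versions pass through unchanged, so `pip` sees a plain `MAJ.MIN.PATCH` for tagged builds and an `a0+git.…` development version when built from a git checkout.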
0 | 0 | #!/usr/bin/env python |
1 | 1 | # encoding: ISO8859-1 |
2 | 2 | # Thomas Nagy, 2005-2016 |
3 | ||
3 | # | |
4 | 4 | """ |
5 | 5 | Redistribution and use in source and binary forms, with or without |
6 | 6 | modification, are permitted provided that the following conditions |
31 | 31 | |
32 | 32 | import os, sys, inspect |
33 | 33 | |
34 | VERSION="1.8.22" | |
35 | REVISION="596301b77b6d6efab064109ecd67cd79" | |
36 | GIT="129ec0cfe7dc9d2880e085fab2a449c651ac8285" | |
34 | VERSION="1.9.6" | |
35 | REVISION="1e8548ddb990ceb895e5c7946e836f9f" | |
36 | GIT="0bca3987addc676be6079d46df87408e0f8abba8" | |
37 | 37 | INSTALL='' |
38 | C1='#.' | |
39 | C2='#,' | |
38 | C1='#>' | |
39 | C2='#;' | |
40 | 40 | C3='#&' |
41 | 41 | cwd = os.getcwd() |
42 | 42 | join = os.path.join |
6 | 6 | import cPickle |
7 | 7 | except ImportError: |
8 | 8 | import pickle as cPickle |
9 | from waflib import Runner,TaskGen,Utils,ConfigSet,Task,Logs,Options,Context,Errors | |
10 | import waflib.Node | |
9 | from waflib import Node,Runner,TaskGen,Utils,ConfigSet,Task,Logs,Options,Context,Errors | |
11 | 10 | CACHE_DIR='c4che' |
12 | 11 | CACHE_SUFFIX='_cache.py' |
13 | 12 | INSTALL=1337 |
14 | 13 | UNINSTALL=-1337 |
15 | SAVED_ATTRS='root node_deps raw_deps task_sigs'.split() | |
14 | SAVED_ATTRS='root node_sigs task_sigs imp_sigs raw_deps node_deps'.split() | |
16 | 15 | CFG_FILES='cfg_files' |
17 | 16 | POST_AT_ONCE=0 |
18 | 17 | POST_LAZY=1 |
19 | POST_BOTH=2 | |
20 | 18 | PROTOCOL=-1 |
21 | 19 | if sys.platform=='cli': |
22 | 20 | PROTOCOL=0 |
28 | 26 | super(BuildContext,self).__init__(**kw) |
29 | 27 | self.is_install=0 |
30 | 28 | self.top_dir=kw.get('top_dir',Context.top_dir) |
29 | self.out_dir=kw.get('out_dir',Context.out_dir) | |
31 | 30 | self.run_dir=kw.get('run_dir',Context.run_dir) |
32 | self.post_mode=POST_AT_ONCE | |
33 | self.out_dir=kw.get('out_dir',Context.out_dir) | |
34 | self.cache_dir=kw.get('cache_dir',None) | |
31 | self.launch_dir=Context.launch_dir | |
32 | self.post_mode=POST_LAZY | |
33 | self.cache_dir=kw.get('cache_dir') | |
35 | 34 | if not self.cache_dir: |
36 | 35 | self.cache_dir=os.path.join(self.out_dir,CACHE_DIR) |
37 | 36 | self.all_envs={} |
37 | self.node_sigs={} | |
38 | 38 | self.task_sigs={} |
39 | self.imp_sigs={} | |
39 | 40 | self.node_deps={} |
40 | 41 | self.raw_deps={} |
41 | self.cache_dir_contents={} | |
42 | 42 | self.task_gen_cache_names={} |
43 | self.launch_dir=Context.launch_dir | |
44 | 43 | self.jobs=Options.options.jobs |
45 | 44 | self.targets=Options.options.targets |
46 | 45 | self.keep=Options.options.keep |
49 | 48 | self.current_group=0 |
50 | 49 | self.groups=[] |
51 | 50 | self.group_names={} |
51 | for v in SAVED_ATTRS: | |
52 | if not hasattr(self,v): | |
53 | setattr(self,v,{}) | |
52 | 54 | def get_variant_dir(self): |
53 | 55 | if not self.variant: |
54 | 56 | return self.out_dir |
58 | 60 | kw['bld']=self |
59 | 61 | ret=TaskGen.task_gen(*k,**kw) |
60 | 62 | self.task_gen_cache_names={} |
61 | self.add_to_group(ret,group=kw.get('group',None)) | |
63 | self.add_to_group(ret,group=kw.get('group')) | |
62 | 64 | return ret |
63 | 65 | def rule(self,*k,**kw): |
64 | 66 | def f(rule): |
67 | 69 | return ret |
68 | 70 | return f |
69 | 71 | def __copy__(self): |
70 | raise Errors.WafError('build contexts are not supposed to be copied') | |
71 | def install_files(self,*k,**kw): | |
72 | pass | |
73 | def install_as(self,*k,**kw): | |
74 | pass | |
75 | def symlink_as(self,*k,**kw): | |
76 | pass | |
72 | raise Errors.WafError('build contexts cannot be copied') | |
77 | 73 | def load_envs(self): |
78 | 74 | node=self.root.find_node(self.cache_dir) |
79 | 75 | if not node: |
87 | 83 | self.all_envs[name]=env |
88 | 84 | for f in env[CFG_FILES]: |
89 | 85 | newnode=self.root.find_resource(f) |
90 | try: | |
91 | h=Utils.h_file(newnode.abspath()) | |
92 | except(IOError,AttributeError): | |
93 | Logs.error('cannot find %r'%f) | |
94 | h=Utils.SIG_NIL | |
95 | newnode.sig=h | |
86 | if not newnode or not newnode.exists(): | |
87 | raise Errors.WafError('Missing configuration file %r, reconfigure the project!'%f) | |
96 | 88 | def init_dirs(self): |
97 | 89 | if not(os.path.isabs(self.top_dir)and os.path.isabs(self.out_dir)): |
98 | 90 | raise Errors.WafError('The project was not configured: run "waf configure" first!') |
105 | 97 | self.load_envs() |
106 | 98 | self.execute_build() |
107 | 99 | def execute_build(self): |
108 | Logs.info("Waf: Entering directory `%s'"%self.variant_dir) | |
100 | Logs.info("Waf: Entering directory `%s'",self.variant_dir) | |
109 | 101 | self.recurse([self.run_dir]) |
110 | 102 | self.pre_build() |
111 | 103 | self.timer=Utils.Timer() |
113 | 105 | self.compile() |
114 | 106 | finally: |
115 | 107 | if self.progress_bar==1 and sys.stderr.isatty(): |
116 | c=len(self.returned_tasks)or 1 | |
108 | c=self.producer.processed or 1 | |
117 | 109 | m=self.progress_line(c,c,Logs.colors.BLUE,Logs.colors.NORMAL) |
118 | 110 | Logs.info(m,extra={'stream':sys.stderr,'c1':Logs.colors.cursor_off,'c2':Logs.colors.cursor_on}) |
119 | Logs.info("Waf: Leaving directory `%s'"%self.variant_dir) | |
111 | Logs.info("Waf: Leaving directory `%s'",self.variant_dir) | |
112 | try: | |
113 | self.producer.bld=None | |
114 | del self.producer | |
115 | except AttributeError: | |
116 | pass | |
120 | 117 | self.post_build() |
121 | 118 | def restore(self): |
122 | 119 | try: |
124 | 121 | except EnvironmentError: |
125 | 122 | pass |
126 | 123 | else: |
127 | if env['version']<Context.HEXVERSION: | |
124 | if env.version<Context.HEXVERSION: | |
128 | 125 | raise Errors.WafError('Version mismatch! reconfigure the project') |
129 | for t in env['tools']: | |
126 | for t in env.tools: | |
130 | 127 | self.setup(**t) |
131 | 128 | dbfn=os.path.join(self.variant_dir,Context.DBFILE) |
132 | 129 | try: |
133 | 130 | data=Utils.readf(dbfn,'rb') |
134 | except(IOError,EOFError): | |
135 | Logs.debug('build: Could not load the build cache %s (missing)'%dbfn) | |
131 | except(EnvironmentError,EOFError): | |
132 | Logs.debug('build: Could not load the build cache %s (missing)',dbfn) | |
136 | 133 | else: |
137 | 134 | try: |
138 | waflib.Node.pickle_lock.acquire() | |
139 | waflib.Node.Nod3=self.node_class | |
135 | Node.pickle_lock.acquire() | |
136 | Node.Nod3=self.node_class | |
140 | 137 | try: |
141 | 138 | data=cPickle.loads(data) |
142 | 139 | except Exception as e: |
143 | Logs.debug('build: Could not pickle the build cache %s: %r'%(dbfn,e)) | |
140 | Logs.debug('build: Could not pickle the build cache %s: %r',dbfn,e) | |
144 | 141 | else: |
145 | 142 | for x in SAVED_ATTRS: |
146 | setattr(self,x,data[x]) | |
143 | setattr(self,x,data.get(x,{})) | |
147 | 144 | finally: |
148 | waflib.Node.pickle_lock.release() | |
145 | Node.pickle_lock.release() | |
149 | 146 | self.init_dirs() |
150 | 147 | def store(self): |
151 | 148 | data={} |
153 | 150 | data[x]=getattr(self,x) |
154 | 151 | db=os.path.join(self.variant_dir,Context.DBFILE) |
155 | 152 | try: |
156 | waflib.Node.pickle_lock.acquire() | |
157 | waflib.Node.Nod3=self.node_class | |
153 | Node.pickle_lock.acquire() | |
154 | Node.Nod3=self.node_class | |
158 | 155 | x=cPickle.dumps(data,PROTOCOL) |
159 | 156 | finally: |
160 | waflib.Node.pickle_lock.release() | |
157 | Node.pickle_lock.release() | |
161 | 158 | Utils.writef(db+'.tmp',x,m='wb') |
162 | 159 | try: |
163 | 160 | st=os.stat(db) |
171 | 168 | Logs.debug('build: compile()') |
172 | 169 | self.producer=Runner.Parallel(self,self.jobs) |
173 | 170 | self.producer.biter=self.get_build_iterator() |
174 | self.returned_tasks=[] | |
175 | 171 | try: |
176 | 172 | self.producer.start() |
177 | 173 | except KeyboardInterrupt: |
197 | 193 | self.all_envs[self.variant]=val |
198 | 194 | env=property(get_env,set_env) |
199 | 195 | def add_manual_dependency(self,path,value): |
200 | if path is None: | |
201 | raise ValueError('Invalid input') | |
202 | if isinstance(path,waflib.Node.Node): | |
196 | if not path: | |
197 | raise ValueError('Invalid input path %r'%path) | |
198 | if isinstance(path,Node.Node): | |
203 | 199 | node=path |
204 | 200 | elif os.path.isabs(path): |
205 | 201 | node=self.root.find_resource(path) |
206 | 202 | else: |
207 | 203 | node=self.path.find_resource(path) |
204 | if not node: | |
205 | raise ValueError('Could not find the path %r'%path) | |
208 | 206 | if isinstance(value,list): |
209 | self.deps_man[id(node)].extend(value) | |
210 | else: | |
211 | self.deps_man[id(node)].append(value) | |
207 | self.deps_man[node].extend(value) | |
208 | else: | |
209 | self.deps_man[node].append(value) | |
212 | 210 | def launch_node(self): |
213 | 211 | try: |
214 | 212 | return self.p_ln |
231 | 229 | except KeyError: |
232 | 230 | pass |
233 | 231 | lst=[env[a]for a in vars_lst] |
234 | ret=Utils.h_list(lst) | |
232 | cache[idx]=ret=Utils.h_list(lst) | |
235 | 233 | Logs.debug('envhash: %s %r',Utils.to_hex(ret),lst) |
236 | cache[idx]=ret | |
237 | 234 | return ret |
238 | 235 | def get_tgen_by_name(self,name): |
239 | 236 | cache=self.task_gen_cache_names |
248 | 245 | return cache[name] |
249 | 246 | except KeyError: |
250 | 247 | raise Errors.WafError('Could not find a task generator for the name %r'%name) |
251 | def progress_line(self,state,total,col1,col2): | |
248 | def progress_line(self,idx,total,col1,col2): | |
252 | 249 | if not sys.stderr.isatty(): |
253 | 250 | return'' |
254 | 251 | n=len(str(total)) |
255 | 252 | Utils.rot_idx+=1 |
256 | 253 | ind=Utils.rot_chr[Utils.rot_idx%4] |
257 | pc=(100.*state)/total | |
258 | eta=str(self.timer) | |
259 | fs="[%%%dd/%%%dd][%%s%%2d%%%%%%s][%s]["%(n,n,ind) | |
260 | left=fs%(state,total,col1,pc,col2) | |
261 | right='][%s%s%s]'%(col1,eta,col2) | |
254 | pc=(100.*idx)/total | |
255 | fs="[%%%dd/%%d][%%s%%2d%%%%%%s][%s]["%(n,ind) | |
256 | left=fs%(idx,total,col1,pc,col2) | |
257 | right='][%s%s%s]'%(col1,self.timer,col2) | |
262 | 258 | cols=Logs.get_term_cols()-len(left)-len(right)+2*len(col1)+2*len(col2) |
263 | 259 | if cols<7:cols=7 |
264 | ratio=((cols*state)//total)-1 | |
260 | ratio=((cols*idx)//total)-1 | |
265 | 261 | bar=('='*ratio+'>').ljust(cols) |
266 | 262 | msg=Logs.indicator%(left,bar,right) |
267 | 263 | return msg |
304 | 300 | return'' |
305 | 301 | def get_group_idx(self,tg): |
306 | 302 | se=id(tg) |
307 | for i in range(len(self.groups)): | |
308 | for t in self.groups[i]: | |
303 | for i,tmp in enumerate(self.groups): | |
304 | for t in tmp: | |
309 | 305 | if id(t)==se: |
310 | 306 | return i |
311 | 307 | return None |
312 | 308 | def add_group(self,name=None,move=True): |
313 | 309 | if name and name in self.group_names: |
314 | Logs.error('add_group: name %s already present'%name) | |
310 | raise Errors.WafError('add_group: name %s already present',name) | |
315 | 311 | g=[] |
316 | 312 | self.group_names[name]=g |
317 | 313 | self.groups.append(g) |
320 | 316 | def set_group(self,idx): |
321 | 317 | if isinstance(idx,str): |
322 | 318 | g=self.group_names[idx] |
323 | for i in range(len(self.groups)): | |
324 | if id(g)==id(self.groups[i]): | |
319 | for i,tmp in enumerate(self.groups): | |
320 | if id(g)==id(tmp): | |
325 | 321 | self.current_group=i |
326 | 322 | break |
327 | 323 | else: |
379 | 375 | Logs.warn('Building from the build directory, forcing --targets=*') |
380 | 376 | ln=self.srcnode |
381 | 377 | elif not ln.is_child_of(self.srcnode): |
382 | Logs.warn('CWD %s is not under %s, forcing --targets=* (run distclean?)'%(ln.abspath(),self.srcnode.abspath())) | |
378 | Logs.warn('CWD %s is not under %s, forcing --targets=* (run distclean?)',ln.abspath(),self.srcnode.abspath()) | |
383 | 379 | ln=self.srcnode |
384 | 380 | for tg in self.groups[self.cur]: |
385 | 381 | try: |
420 | 416 | yield tasks |
421 | 417 | while 1: |
422 | 418 | yield[] |
419 | def install_files(self,dest,files,**kw): | |
420 | assert(dest) | |
421 | tg=self(features='install_task',install_to=dest,install_from=files,**kw) | |
422 | tg.dest=tg.install_to | |
423 | tg.type='install_files' | |
424 | if not kw.get('postpone',True): | |
425 | tg.post() | |
426 | return tg | |
427 | def install_as(self,dest,srcfile,**kw): | |
428 | assert(dest) | |
429 | tg=self(features='install_task',install_to=dest,install_from=srcfile,**kw) | |
430 | tg.dest=tg.install_to | |
431 | tg.type='install_as' | |
432 | if not kw.get('postpone',True): | |
433 | tg.post() | |
434 | return tg | |
435 | def symlink_as(self,dest,src,**kw): | |
436 | assert(dest) | |
437 | tg=self(features='install_task',install_to=dest,install_from=src,**kw) | |
438 | tg.dest=tg.install_to | |
439 | tg.type='symlink_as' | |
440 | tg.link=src | |
441 | if not kw.get('postpone',True): | |
442 | tg.post() | |
443 | return tg | |
444 | @TaskGen.feature('install_task') | |
445 | @TaskGen.before_method('process_rule','process_source') | |
446 | def process_install_task(self): | |
447 | self.add_install_task(**self.__dict__) | |
448 | @TaskGen.taskgen_method | |
449 | def add_install_task(self,**kw): | |
450 | if not self.bld.is_install: | |
451 | return | |
452 | if not kw['install_to']: | |
453 | return | |
454 | if kw['type']=='symlink_as'and Utils.is_win32: | |
455 | if kw.get('win32_install'): | |
456 | kw['type']='install_as' | |
457 | else: | |
458 | return | |
459 | tsk=self.install_task=self.create_task('inst') | |
460 | tsk.chmod=kw.get('chmod',Utils.O644) | |
461 | tsk.link=kw.get('link','')or kw.get('install_from','') | |
462 | tsk.relative_trick=kw.get('relative_trick',False) | |
463 | tsk.type=kw['type'] | |
464 | tsk.install_to=tsk.dest=kw['install_to'] | |
465 | tsk.install_from=kw['install_from'] | |
466 | tsk.relative_base=kw.get('cwd')or kw.get('relative_base',self.path) | |
467 | tsk.install_user=kw.get('install_user') | |
468 | tsk.install_group=kw.get('install_group') | |
469 | tsk.init_files() | |
470 | if not kw.get('postpone',True): | |
471 | tsk.run_now() | |
472 | return tsk | |
473 | @TaskGen.taskgen_method | |
474 | def add_install_files(self,**kw): | |
475 | kw['type']='install_files' | |
476 | return self.add_install_task(**kw) | |
477 | @TaskGen.taskgen_method | |
478 | def add_install_as(self,**kw): | |
479 | kw['type']='install_as' | |
480 | return self.add_install_task(**kw) | |
481 | @TaskGen.taskgen_method | |
482 | def add_symlink_as(self,**kw): | |
483 | kw['type']='symlink_as' | |
484 | return self.add_install_task(**kw) | |
423 | 485 | class inst(Task.Task): |
424 | color='CYAN' | |
486 | def __str__(self): | |
487 | return'' | |
425 | 488 | def uid(self): |
426 | lst=[self.dest,self.path]+self.source | |
427 | return Utils.h_list(repr(lst)) | |
428 | def post(self): | |
429 | buf=[] | |
430 | for x in self.source: | |
431 | if isinstance(x,waflib.Node.Node): | |
432 | y=x | |
433 | else: | |
434 | y=self.path.find_resource(x) | |
435 | if not y: | |
436 | if os.path.isabs(x): | |
437 | y=self.bld.root.make_node(x) | |
438 | else: | |
439 | y=self.path.make_node(x) | |
440 | buf.append(y) | |
441 | self.inputs=buf | |
489 | lst=self.inputs+self.outputs+[self.link,self.generator.path.abspath()] | |
490 | return Utils.h_list(lst) | |
491 | def init_files(self): | |
492 | if self.type=='symlink_as': | |
493 | inputs=[] | |
494 | else: | |
495 | inputs=self.generator.to_nodes(self.install_from) | |
496 | if self.type=='install_as': | |
497 | assert len(inputs)==1 | |
498 | self.set_inputs(inputs) | |
499 | dest=self.get_install_path() | |
500 | outputs=[] | |
501 | if self.type=='symlink_as': | |
502 | if self.relative_trick: | |
503 | self.link=os.path.relpath(self.link,os.path.dirname(dest)) | |
504 | outputs.append(self.generator.bld.root.make_node(dest)) | |
505 | elif self.type=='install_as': | |
506 | outputs.append(self.generator.bld.root.make_node(dest)) | |
507 | else: | |
508 | for y in inputs: | |
509 | if self.relative_trick: | |
510 | destfile=os.path.join(dest,y.path_from(self.relative_base)) | |
511 | else: | |
512 | destfile=os.path.join(dest,y.name) | |
513 | outputs.append(self.generator.bld.root.make_node(destfile)) | |
514 | self.set_outputs(outputs) | |
442 | 515 | def runnable_status(self): |
443 | 516 | ret=super(inst,self).runnable_status() |
444 | if ret==Task.SKIP_ME: | |
517 | if ret==Task.SKIP_ME and self.generator.bld.is_install: | |
445 | 518 | return Task.RUN_ME |
446 | 519 | return ret |
447 | def __str__(self): | |
448 | return'' | |
449 | def run(self): | |
450 | return self.generator.exec_task() | |
520 | def post_run(self): | |
521 | pass | |
451 | 522 | def get_install_path(self,destdir=True): |
452 | dest=Utils.subst_vars(self.dest,self.env) | |
453 | dest=dest.replace('/',os.sep) | |
523 | if isinstance(self.install_to,Node.Node): | |
524 | dest=self.install_to.abspath() | |
525 | else: | |
526 | dest=Utils.subst_vars(self.install_to,self.env) | |
454 | 527 | if destdir and Options.options.destdir: |
455 | 528 | dest=os.path.join(Options.options.destdir,os.path.splitdrive(dest)[1].lstrip(os.sep)) |
456 | 529 | return dest |
457 | def exec_install_files(self): | |
458 | destpath=self.get_install_path() | |
459 | if not destpath: | |
460 | raise Errors.WafError('unknown installation path %r'%self.generator) | |
461 | for x,y in zip(self.source,self.inputs): | |
462 | if self.relative_trick: | |
463 | destfile=os.path.join(destpath,y.path_from(self.path)) | |
464 | else: | |
465 | destfile=os.path.join(destpath,y.name) | |
466 | self.generator.bld.do_install(y.abspath(),destfile,chmod=self.chmod,tsk=self) | |
467 | def exec_install_as(self): | |
468 | destfile=self.get_install_path() | |
469 | self.generator.bld.do_install(self.inputs[0].abspath(),destfile,chmod=self.chmod,tsk=self) | |
470 | def exec_symlink_as(self): | |
471 | destfile=self.get_install_path() | |
472 | src=self.link | |
473 | if self.relative_trick: | |
474 | src=os.path.relpath(src,os.path.dirname(destfile)) | |
475 | self.generator.bld.do_link(src,destfile,tsk=self) | |
476 | class InstallContext(BuildContext): | |
477 | '''installs the targets on the system''' | |
478 | cmd='install' | |
479 | def __init__(self,**kw): | |
480 | super(InstallContext,self).__init__(**kw) | |
481 | self.uninstall=[] | |
482 | self.is_install=INSTALL | |
483 | def copy_fun(self,src,tgt,**kw): | |
530 | def copy_fun(self,src,tgt): | |
484 | 531 | if Utils.is_win32 and len(tgt)>259 and not tgt.startswith('\\\\?\\'): |
485 | 532 | tgt='\\\\?\\'+tgt |
486 | 533 | shutil.copy2(src,tgt) |
487 | os.chmod(tgt,kw.get('chmod',Utils.O644)) | |
488 | def do_install(self,src,tgt,**kw): | |
489 | d,_=os.path.split(tgt) | |
490 | if not d: | |
491 | raise Errors.WafError('Invalid installation given %r->%r'%(src,tgt)) | |
492 | Utils.check_dir(d) | |
493 | srclbl=src.replace(self.srcnode.abspath()+os.sep,'') | |
534 | self.fix_perms(tgt) | |
535 | def rm_empty_dirs(self,tgt): | |
536 | while tgt: | |
537 | tgt=os.path.dirname(tgt) | |
538 | try: | |
539 | os.rmdir(tgt) | |
540 | except OSError: | |
541 | break | |
542 | def run(self): | |
543 | is_install=self.generator.bld.is_install | |
544 | if not is_install: | |
545 | return | |
546 | for x in self.outputs: | |
547 | if is_install==INSTALL: | |
548 | x.parent.mkdir() | |
549 | if self.type=='symlink_as': | |
550 | fun=is_install==INSTALL and self.do_link or self.do_unlink | |
551 | fun(self.link,self.outputs[0].abspath()) | |
552 | else: | |
553 | fun=is_install==INSTALL and self.do_install or self.do_uninstall | |
554 | launch_node=self.generator.bld.launch_node() | |
555 | for x,y in zip(self.inputs,self.outputs): | |
556 | fun(x.abspath(),y.abspath(),x.path_from(launch_node)) | |
557 | def run_now(self): | |
558 | status=self.runnable_status() | |
559 | if status not in(Task.RUN_ME,Task.SKIP_ME): | |
560 | raise Errors.TaskNotReady('Could not process %r: status %r'%(self,status)) | |
561 | self.run() | |
562 | self.hasrun=Task.SUCCESS | |
563 | def do_install(self,src,tgt,lbl,**kw): | |
494 | 564 | if not Options.options.force: |
495 | 565 | try: |
496 | 566 | st1=os.stat(tgt) |
499 | 569 | pass |
500 | 570 | else: |
501 | 571 | if st1.st_mtime+2>=st2.st_mtime and st1.st_size==st2.st_size: |
502 | if not self.progress_bar: | |
503 | Logs.info('- install %s (from %s)'%(tgt,srclbl)) | |
572 | if not self.generator.bld.progress_bar: | |
573 | Logs.info('- install %s (from %s)',tgt,lbl) | |
504 | 574 | return False |
505 | if not self.progress_bar: | |
506 | Logs.info('+ install %s (from %s)'%(tgt,srclbl)) | |
575 | if not self.generator.bld.progress_bar: | |
576 | Logs.info('+ install %s (from %s)',tgt,lbl) | |
507 | 577 | try: |
508 | 578 | os.chmod(tgt,Utils.O644|stat.S_IMODE(os.stat(tgt).st_mode)) |
509 | 579 | except EnvironmentError: |
513 | 583 | except OSError: |
514 | 584 | pass |
515 | 585 | try: |
516 | self.copy_fun(src,tgt,**kw) | |
517 | except IOError: | |
586 | self.copy_fun(src,tgt) | |
587 | except EnvironmentError as e: | |
588 | if not os.path.exists(src): | |
589 | Logs.error('File %r does not exist',src) | |
590 | elif not os.path.isfile(src): | |
591 | Logs.error('Input %r is not a file',src) | |
592 | raise Errors.WafError('Could not install the file %r'%tgt,e) | |
593 | def fix_perms(self,tgt): | |
594 | if not Utils.is_win32: | |
595 | user=getattr(self,'install_user',None)or getattr(self.generator,'install_user',None) | |
596 | group=getattr(self,'install_group',None)or getattr(self.generator,'install_group',None) | |
597 | if user or group: | |
598 | Utils.lchown(tgt,user or-1,group or-1) | |
599 | if not os.path.islink(tgt): | |
600 | os.chmod(tgt,self.chmod) | |
601 | def do_link(self,src,tgt,**kw): | |
602 | if os.path.islink(tgt)and os.readlink(tgt)==src: | |
603 | if not self.generator.bld.progress_bar: | |
604 | Logs.info('- symlink %s (to %s)',tgt,src) | |
605 | else: | |
518 | 606 | try: |
519 | os.stat(src) | |
520 | except EnvironmentError: | |
521 | Logs.error('File %r does not exist'%src) | |
522 | raise Errors.WafError('Could not install the file %r'%tgt) | |
523 | def do_link(self,src,tgt,**kw): | |
524 | d,_=os.path.split(tgt) | |
525 | Utils.check_dir(d) | |
526 | link=False | |
527 | if not os.path.islink(tgt): | |
528 | link=True | |
529 | elif os.readlink(tgt)!=src: | |
530 | link=True | |
531 | if link: | |
532 | try:os.remove(tgt) | |
533 | except OSError:pass | |
534 | if not self.progress_bar: | |
535 | Logs.info('+ symlink %s (to %s)'%(tgt,src)) | |
607 | os.remove(tgt) | |
608 | except OSError: | |
609 | pass | |
610 | if not self.generator.bld.progress_bar: | |
611 | Logs.info('+ symlink %s (to %s)',tgt,src) | |
536 | 612 | os.symlink(src,tgt) |
537 | else: | |
538 | if not self.progress_bar: | |
539 | Logs.info('- symlink %s (to %s)'%(tgt,src)) | |
540 | def run_task_now(self,tsk,postpone): | |
541 | tsk.post() | |
542 | if not postpone: | |
543 | if tsk.runnable_status()==Task.ASK_LATER: | |
544 | raise self.WafError('cannot post the task %r'%tsk) | |
545 | tsk.run() | |
546 | tsk.hasrun=True | |
547 | def install_files(self,dest,files,env=None,chmod=Utils.O644,relative_trick=False,cwd=None,add=True,postpone=True,task=None): | |
548 | assert(dest) | |
549 | tsk=inst(env=env or self.env) | |
550 | tsk.bld=self | |
551 | tsk.path=cwd or self.path | |
552 | tsk.chmod=chmod | |
553 | tsk.task=task | |
554 | if isinstance(files,waflib.Node.Node): | |
555 | tsk.source=[files] | |
556 | else: | |
557 | tsk.source=Utils.to_list(files) | |
558 | tsk.dest=dest | |
559 | tsk.exec_task=tsk.exec_install_files | |
560 | tsk.relative_trick=relative_trick | |
561 | if add:self.add_to_group(tsk) | |
562 | self.run_task_now(tsk,postpone) | |
563 | return tsk | |
564 | def install_as(self,dest,srcfile,env=None,chmod=Utils.O644,cwd=None,add=True,postpone=True,task=None): | |
565 | assert(dest) | |
566 | tsk=inst(env=env or self.env) | |
567 | tsk.bld=self | |
568 | tsk.path=cwd or self.path | |
569 | tsk.chmod=chmod | |
570 | tsk.source=[srcfile] | |
571 | tsk.task=task | |
572 | tsk.dest=dest | |
573 | tsk.exec_task=tsk.exec_install_as | |
574 | if add:self.add_to_group(tsk) | |
575 | self.run_task_now(tsk,postpone) | |
576 | return tsk | |
577 | def symlink_as(self,dest,src,env=None,cwd=None,add=True,postpone=True,relative_trick=False,task=None): | |
578 | if Utils.is_win32: | |
579 | return | |
580 | assert(dest) | |
581 | tsk=inst(env=env or self.env) | |
582 | tsk.bld=self | |
583 | tsk.dest=dest | |
584 | tsk.path=cwd or self.path | |
585 | tsk.source=[] | |
586 | tsk.task=task | |
587 | tsk.link=src | |
588 | tsk.relative_trick=relative_trick | |
589 | tsk.exec_task=tsk.exec_symlink_as | |
590 | if add:self.add_to_group(tsk) | |
591 | self.run_task_now(tsk,postpone) | |
592 | return tsk | |
593 | class UninstallContext(InstallContext): | |
594 | '''removes the targets installed''' | |
595 | cmd='uninstall' | |
596 | def __init__(self,**kw): | |
597 | super(UninstallContext,self).__init__(**kw) | |
598 | self.is_install=UNINSTALL | |
599 | def rm_empty_dirs(self,tgt): | |
600 | while tgt: | |
601 | tgt=os.path.dirname(tgt) | |
602 | try: | |
603 | os.rmdir(tgt) | |
604 | except OSError: | |
605 | break | |
606 | def do_install(self,src,tgt,**kw): | |
607 | if not self.progress_bar: | |
608 | Logs.info('- remove %s'%tgt) | |
609 | self.uninstall.append(tgt) | |
613 | self.fix_perms(tgt) | |
614 | def do_uninstall(self,src,tgt,lbl,**kw): | |
615 | if not self.generator.bld.progress_bar: | |
616 | Logs.info('- remove %s',tgt) | |
610 | 617 | try: |
611 | 618 | os.remove(tgt) |
612 | 619 | except OSError as e: |
615 | 622 | self.uninstall_error=True |
616 | 623 | Logs.warn('build: some files could not be uninstalled (retry with -vv to list them)') |
617 | 624 | if Logs.verbose>1: |
618 | Logs.warn('Could not remove %s (error code %r)'%(e.filename,e.errno)) | |
625 | Logs.warn('Could not remove %s (error code %r)',e.filename,e.errno) | |
619 | 626 | self.rm_empty_dirs(tgt) |
620 | def do_link(self,src,tgt,**kw): | |
621 | try: | |
622 | if not self.progress_bar: | |
623 | Logs.info('- remove %s'%tgt) | |
627 | def do_unlink(self,src,tgt,**kw): | |
628 | try: | |
629 | if not self.generator.bld.progress_bar: | |
630 | Logs.info('- remove %s',tgt) | |
624 | 631 | os.remove(tgt) |
625 | 632 | except OSError: |
626 | 633 | pass |
627 | 634 | self.rm_empty_dirs(tgt) |
635 | class InstallContext(BuildContext): | |
636 | '''installs the targets on the system''' | |
637 | cmd='install' | |
638 | def __init__(self,**kw): | |
639 | super(InstallContext,self).__init__(**kw) | |
640 | self.is_install=INSTALL | |
641 | class UninstallContext(InstallContext): | |
642 | '''removes the targets installed''' | |
643 | cmd='uninstall' | |
644 | def __init__(self,**kw): | |
645 | super(UninstallContext,self).__init__(**kw) | |
646 | self.is_install=UNINSTALL | |
628 | 647 | def execute(self): |
629 | 648 | try: |
630 | 649 | def runnable_status(self): |
650 | 669 | Logs.debug('build: clean called') |
651 | 670 | if self.bldnode!=self.srcnode: |
652 | 671 | lst=[] |
653 | for e in self.all_envs.values(): | |
654 | lst.extend(self.root.find_or_declare(f)for f in e[CFG_FILES]) | |
672 | for env in self.all_envs.values(): | |
673 | lst.extend(self.root.find_or_declare(f)for f in env[CFG_FILES]) | |
655 | 674 | for n in self.bldnode.ant_glob('**/*',excl='.lock* *conf_check_*/** config.log c4che/*',quiet=True): |
656 | 675 | if n in lst: |
657 | 676 | continue |
658 | 677 | n.delete() |
659 | 678 | self.root.children={} |
660 | for v in'node_deps task_sigs raw_deps'.split(): | |
679 | for v in SAVED_ATTRS: | |
680 | if v=='root': | |
681 | continue | |
661 | 682 | setattr(self,v,{}) |
662 | 683 | class ListContext(BuildContext): |
663 | 684 | '''lists the targets to execute''' |
679 | 700 | f() |
680 | 701 | try: |
681 | 702 | self.get_tgen_by_name('') |
682 | except Exception: | |
703 | except Errors.WafError: | |
683 | 704 | pass |
684 | lst=list(self.task_gen_cache_names.keys()) | |
685 | lst.sort() | |
686 | for k in lst: | |
705 | for k in sorted(self.task_gen_cache_names.keys()): | |
687 | 706 | Logs.pprint('GREEN',k) |
688 | 707 | class StepContext(BuildContext): |
689 | 708 | '''executes tasks in a step-by-step fashion, for debugging''' |
696 | 715 | Logs.warn('Add a pattern for the debug build, for example "waf step --files=main.c,app"') |
697 | 716 | BuildContext.compile(self) |
698 | 717 | return |
699 | targets=None | |
718 | targets=[] | |
700 | 719 | if self.targets and self.targets!='*': |
701 | 720 | targets=self.targets.split(',') |
702 | 721 | for g in self.groups: |
728 | 747 | break |
729 | 748 | if do_exec: |
730 | 749 | ret=tsk.run() |
731 | Logs.info('%s -> exit %r'%(str(tsk),ret)) | |
750 | Logs.info('%s -> exit %r',tsk,ret) | |
732 | 751 | def get_matcher(self,pat): |
733 | 752 | inn=True |
734 | 753 | out=True |
756 | 775 | else: |
757 | 776 | return pattern.match(node.abspath()) |
758 | 777 | return match |
778 | class EnvContext(BuildContext): | |
779 | fun=cmd=None | |
780 | def execute(self): | |
781 | self.restore() | |
782 | if not self.all_envs: | |
783 | self.load_envs() | |
784 | self.recurse([self.run_dir]) |
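The directory-pruning loop at the top of this hunk (`rm_empty_dirs`) walks up from a just-removed file and deletes each now-empty parent until `os.rmdir` refuses. A minimal standalone sketch of that pattern, with the surrounding loop and the test scaffolding reconstructed by us:

```python
import os
import tempfile

def rm_empty_dirs(tgt):
    # Walk up from the removed file, deleting each parent directory
    # until one is non-empty: os.rmdir raises OSError then, and we stop.
    while tgt:
        tgt = os.path.dirname(tgt)
        try:
            os.rmdir(tgt)
        except OSError:
            break

root = tempfile.mkdtemp()
deep = os.path.join(root, 'a', 'b', 'c')
os.makedirs(deep)
target = os.path.join(deep, 'file.txt')
open(target, 'w').close()

os.remove(target)      # uninstall the file first, as do_uninstall does
rm_empty_dirs(target)  # then prune the now-empty parent chain
assert not os.path.exists(deep)
```

The `OSError` catch doubles as the termination condition: a non-empty or non-removable directory ends the climb without special-casing filesystem roots.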
23 | 23 | keys=list(keys) |
24 | 24 | keys.sort() |
25 | 25 | return keys |
26 | def __iter__(self): | |
27 | return iter(self.keys()) | |
26 | 28 | def __str__(self): |
27 | 29 | return"\n".join(["%r %r"%(x,self.__getitem__(x))for x in self.keys()]) |
28 | 30 | def __getitem__(self,key): |
29 | 31 | try: |
30 | 32 | while 1: |
31 | x=self.table.get(key,None) | |
33 | x=self.table.get(key) | |
32 | 34 | if not x is None: |
33 | 35 | return x |
34 | 36 | self=self.parent |
77 | 79 | try: |
78 | 80 | value=self.table[key] |
79 | 81 | except KeyError: |
80 | try:value=self.parent[key] | |
81 | except AttributeError:value=[] | |
82 | if isinstance(value,list): | |
83 | value=value[:] | |
82 | try: | |
83 | value=self.parent[key] | |
84 | except AttributeError: | |
85 | value=[] | |
84 | 86 | else: |
85 | value=[value] | |
87 | if isinstance(value,list): | |
88 | value=value[:] | |
89 | else: | |
90 | value=[value] | |
91 | self.table[key]=value | |
86 | 92 | else: |
87 | 93 | if not isinstance(value,list): |
88 | value=[value] | |
89 | self.table[key]=value | |
94 | self.table[key]=value=[value] | |
90 | 95 | return value |
91 | 96 | def append_value(self,var,val): |
92 | 97 | if isinstance(val,str): |
138 | 143 | for m in re_imp.finditer(code): |
139 | 144 | g=m.group |
140 | 145 | tbl[g(2)]=eval(g(3)) |
141 | Logs.debug('env: %s'%str(self.table)) | |
146 | Logs.debug('env: %s',self.table) | |
142 | 147 | def update(self,d): |
143 | for k,v in d.items(): | |
144 | self[k]=v | |
148 | self.table.update(d) | |
145 | 149 | def stash(self): |
146 | 150 | orig=self.table |
147 | 151 | tbl=self.table=self.table.copy() |
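The `__getitem__` hunk above reads through a chain of derived environments: look in the local table, otherwise retry on the parent. A simplified sketch of that lookup (the `MiniEnv` class and the empty-string default for unset keys are our reconstruction of ConfigSet's behavior, not its actual code):

```python
class MiniEnv(object):
    # Sketch of ConfigSet's parent-chained lookup: a derived
    # environment resolves missing keys through its parent.
    def __init__(self, parent=None):
        self.table = {}
        if parent is not None:
            self.parent = parent

    def __getitem__(self, key):
        env = self
        while True:
            x = env.table.get(key)
            if x is not None:
                return x
            env = getattr(env, 'parent', None)
            if env is None:
                return ''  # unset keys fall back to an empty value

base = MiniEnv()
base.table['CC'] = 'gcc'
child = MiniEnv(parent=base)
child.table['CFLAGS'] = ['-O2']

assert child['CC'] == 'gcc'        # inherited from the parent env
assert child['CFLAGS'] == ['-O2']  # the local table wins
assert child['MISSING'] == ''      # default for keys set nowhere
```

Note the diff also tightens the original `self.table.get(key,None)` to `self.table.get(key)`; the two are equivalent, `None` being `get`'s default.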
3 | 3 | |
4 | 4 | import os,shlex,sys,time,re,shutil |
5 | 5 | from waflib import ConfigSet,Utils,Options,Logs,Context,Build,Errors |
6 | BREAK='break' | |
7 | CONTINUE='continue' | |
8 | 6 | WAF_CONFIG_LOG='config.log' |
9 | 7 | autoconfig=False |
10 | 8 | conf_template='''# project %(app)s configured on %(now)s by |
79 | 77 | self.msg('Setting top to',self.srcnode.abspath()) |
80 | 78 | self.msg('Setting out to',self.bldnode.abspath()) |
81 | 79 | if id(self.srcnode)==id(self.bldnode): |
82 | Logs.warn('Setting top == out (remember to use "update_outputs")') | |
80 | Logs.warn('Setting top == out') | |
83 | 81 | elif id(self.path)!=id(self.srcnode): |
84 | 82 | if self.srcnode.is_child_of(self.path): |
85 | 83 | Logs.warn('Are you certain that you do not want to set top="." ?') |
88 | 86 | Context.top_dir=self.srcnode.abspath() |
89 | 87 | Context.out_dir=self.bldnode.abspath() |
90 | 88 | env=ConfigSet.ConfigSet() |
91 | env['argv']=sys.argv | |
92 | env['options']=Options.options.__dict__ | |
89 | env.argv=sys.argv | |
90 | env.options=Options.options.__dict__ | |
91 | env.config_cmd=self.cmd | |
93 | 92 | env.run_dir=Context.run_dir |
94 | 93 | env.top_dir=Context.top_dir |
95 | 94 | env.out_dir=Context.out_dir |
96 | env['hash']=self.hash | |
97 | env['files']=self.files | |
98 | env['environ']=dict(self.environ) | |
95 | env.hash=self.hash | |
96 | env.files=self.files | |
97 | env.environ=dict(self.environ) | |
99 | 98 | if not self.env.NO_LOCK_IN_RUN and not getattr(Options.options,'no_lock_in_run'): |
100 | 99 | env.store(os.path.join(Context.run_dir,Options.lockfile)) |
101 | 100 | if not self.env.NO_LOCK_IN_TOP and not getattr(Options.options,'no_lock_in_top'): |
126 | 125 | for key in self.all_envs: |
127 | 126 | tmpenv=self.all_envs[key] |
128 | 127 | tmpenv.store(os.path.join(self.cachedir.abspath(),key+Build.CACHE_SUFFIX)) |
129 | def load(self,input,tooldir=None,funs=None,with_sys_path=True): | |
128 | def load(self,input,tooldir=None,funs=None,with_sys_path=True,cache=False): | |
130 | 129 | tools=Utils.to_list(input) |
131 | 130 | if tooldir:tooldir=Utils.to_list(tooldir) |
132 | 131 | for tool in tools: |
133 | mag=(tool,id(self.env),tooldir,funs) | |
134 | if mag in self.tool_cache: | |
135 | self.to_log('(tool %s is already loaded, skipping)'%tool) | |
136 | continue | |
137 | self.tool_cache.append(mag) | |
132 | if cache: | |
133 | mag=(tool,id(self.env),tooldir,funs) | |
134 | if mag in self.tool_cache: | |
135 | self.to_log('(tool %s is already loaded, skipping)'%tool) | |
136 | continue | |
137 | self.tool_cache.append(mag) | |
138 | 138 | module=None |
139 | 139 | try: |
140 | 140 | module=Context.load_tool(tool,tooldir,ctx=self,with_sys_path=with_sys_path) |
160 | 160 | self.rules=Utils.to_list(rules) |
161 | 161 | for x in self.rules: |
162 | 162 | f=getattr(self,x) |
163 | if not f:self.fatal("No such method '%s'."%x) | |
164 | try: | |
165 | f() | |
166 | except Exception as e: | |
167 | ret=self.err_handler(x,e) | |
168 | if ret==BREAK: | |
169 | break | |
170 | elif ret==CONTINUE: | |
171 | continue | |
172 | else: | |
173 | raise | |
174 | def err_handler(self,fun,error): | |
175 | pass | |
163 | if not f: | |
164 | self.fatal('No such configuration function %r'%x) | |
165 | f() | |
176 | 166 | def conf(f): |
177 | 167 | def fun(*k,**kw): |
178 | 168 | mandatory=True |
189 | 179 | setattr(Build.BuildContext,f.__name__,fun) |
190 | 180 | return f |
191 | 181 | @conf |
192 | def add_os_flags(self,var,dest=None,dup=True): | |
182 | def add_os_flags(self,var,dest=None,dup=False): | |
193 | 183 | try: |
194 | 184 | flags=shlex.split(self.environ[var]) |
195 | 185 | except KeyError: |
198 | 188 | self.env.append_value(dest or var,flags) |
199 | 189 | @conf |
200 | 190 | def cmd_to_list(self,cmd): |
201 | if isinstance(cmd,str)and cmd.find(' '): | |
202 | try: | |
203 | os.stat(cmd) | |
204 | except OSError: | |
191 | if isinstance(cmd,str): | |
192 | if os.path.isfile(cmd): | |
193 | return[cmd] | |
194 | if os.sep=='/': | |
205 | 195 | return shlex.split(cmd) |
206 | 196 | else: |
207 | return[cmd] | |
197 | try: | |
198 | return shlex.split(cmd,posix=False) | |
199 | except TypeError: | |
200 | return shlex.split(cmd) | |
208 | 201 | return cmd |
209 | 202 | @conf |
210 | def check_waf_version(self,mini='1.7.99',maxi='1.9.0',**kw): | |
203 | def check_waf_version(self,mini='1.8.99',maxi='2.0.0',**kw): | |
211 | 204 | self.start_msg('Checking for waf version in %s-%s'%(str(mini),str(maxi)),**kw) |
212 | 205 | ver=Context.HEXVERSION |
213 | 206 | if Utils.num2ver(mini)>ver: |
238 | 231 | path_list=Utils.to_list(path_list) |
239 | 232 | else: |
240 | 233 | path_list=environ.get('PATH','').split(os.pathsep) |
241 | if var in environ: | |
242 | filename=environ[var] | |
243 | if os.path.isfile(filename): | |
244 | ret=[filename] | |
245 | else: | |
246 | ret=self.cmd_to_list(filename) | |
234 | if kw.get('value'): | |
235 | ret=self.cmd_to_list(kw['value']) | |
236 | elif var in environ: | |
237 | ret=self.cmd_to_list(environ[var]) | |
247 | 238 | elif self.env[var]: |
248 | ret=self.env[var] | |
249 | ret=self.cmd_to_list(ret) | |
239 | ret=self.cmd_to_list(self.env[var]) | |
250 | 240 | else: |
251 | 241 | if not ret: |
252 | 242 | ret=self.find_binary(filename,exts.split(','),path_list) |
262 | 252 | retmsg=ret |
263 | 253 | else: |
264 | 254 | retmsg=False |
265 | self.msg("Checking for program '%s'"%msg,retmsg,**kw) | |
266 | if not kw.get('quiet',None): | |
255 | self.msg('Checking for program %r'%msg,retmsg,**kw) | |
256 | if not kw.get('quiet'): | |
267 | 257 | self.to_log('find program=%r paths=%r var=%r -> %r'%(filename,path_list,var,ret)) |
268 | 258 | if not ret: |
269 | 259 | self.fatal(kw.get('errmsg','')or'Could not find the program %r'%filename) |
270 | interpreter=kw.get('interpreter',None) | |
260 | interpreter=kw.get('interpreter') | |
271 | 261 | if interpreter is None: |
272 | 262 | if not Utils.check_exe(ret[0],env=environ): |
273 | 263 | self.fatal('Program %r is not executable'%ret) |
306 | 296 | if cachemode==1: |
307 | 297 | try: |
308 | 298 | proj=ConfigSet.ConfigSet(os.path.join(dir,'cache_run_build')) |
309 | except OSError: | |
310 | pass | |
311 | except IOError: | |
299 | except EnvironmentError: | |
312 | 300 | pass |
313 | 301 | else: |
314 | 302 | ret=proj['cache_run_build'] |
318 | 306 | bdir=os.path.join(dir,'testbuild') |
319 | 307 | if not os.path.exists(bdir): |
320 | 308 | os.makedirs(bdir) |
321 | self.test_bld=bld=Build.BuildContext(top_dir=dir,out_dir=bdir) | |
309 | cls_name=getattr(self,'run_build_cls','build') | |
310 | self.test_bld=bld=Context.create_context(cls_name,top_dir=dir,out_dir=bdir) | |
322 | 311 | bld.init_dirs() |
323 | 312 | bld.progress_bar=0 |
324 | 313 | bld.targets='*' |
354 | 343 | def test(self,*k,**kw): |
355 | 344 | if not'env'in kw: |
356 | 345 | kw['env']=self.env.derive() |
357 | if kw.get('validate',None): | |
346 | if kw.get('validate'): | |
358 | 347 | kw['validate'](kw) |
359 | 348 | self.start_msg(kw['msg'],**kw) |
360 | 349 | ret=None |
368 | 357 | self.fatal('The configuration failed') |
369 | 358 | else: |
370 | 359 | kw['success']=ret |
371 | if kw.get('post_check',None): | |
360 | if kw.get('post_check'): | |
372 | 361 | ret=kw['post_check'](kw) |
373 | 362 | if ret: |
374 | 363 | self.end_msg(kw['errmsg'],'YELLOW',**kw) |
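The rewritten `cmd_to_list` above drops the fragile `cmd.find(' ')`/`os.stat` probe: an existing file path is now returned whole (so interpreter paths containing spaces survive), and splitting is non-POSIX on Windows where `shlex` supports it. A standalone sketch of that logic:

```python
import os
import shlex

def cmd_to_list(cmd):
    # Sketch of the 1.9 logic: a path naming an existing file is kept
    # as a single argument; other strings are split, using non-POSIX
    # rules on Windows when the Python version supports them.
    if isinstance(cmd, str):
        if os.path.isfile(cmd):
            return [cmd]
        if os.sep == '/':
            return shlex.split(cmd)
        try:
            return shlex.split(cmd, posix=False)
        except TypeError:
            # very old Pythons lack the posix= keyword
            return shlex.split(cmd)
    return cmd  # already a list: pass through untouched

assert cmd_to_list('gcc -O2 -c main.c') == ['gcc', '-O2', '-c', 'main.c']
assert cmd_to_list(['ls', '-l']) == ['ls', '-l']
```

The assertions assume a POSIX host; on Windows the `posix=False` branch keeps backslashes in paths intact instead of treating them as escapes.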
4 | 4 | import os,re,imp,sys |
5 | 5 | from waflib import Utils,Errors,Logs |
6 | 6 | import waflib.Node |
7 | HEXVERSION=0x1081600 | |
8 | WAFVERSION="1.8.22" | |
9 | WAFREVISION="17d4d4faa52c454eb3580e482df69b2a80e19fa7" | |
10 | ABI=98 | |
7 | HEXVERSION=0x1090600 | |
8 | WAFVERSION="1.9.6" | |
9 | WAFREVISION="dbcda7ec6a52a88c7a605a357eb5713438ac2704" | |
10 | ABI=99 | |
11 | 11 | DBFILE='.wafpickle-%s-%d-%d'%(sys.platform,sys.hexversion,ABI) |
12 | 12 | APPNAME='APPNAME' |
13 | 13 | VERSION='VERSION' |
19 | 19 | top_dir='' |
20 | 20 | out_dir='' |
21 | 21 | waf_dir='' |
22 | local_repo='' | |
23 | remote_repo='https://raw.githubusercontent.com/waf-project/waf/master/' | |
24 | remote_locs=['waflib/extras','waflib/Tools'] | |
25 | 22 | g_module=None |
26 | 23 | STDOUT=1 |
27 | 24 | STDERR=-1 |
39 | 36 | def __init__(cls,name,bases,dict): |
40 | 37 | super(store_context,cls).__init__(name,bases,dict) |
41 | 38 | name=cls.__name__ |
42 | if name=='ctx'or name=='Context': | |
39 | if name in('ctx','Context'): | |
43 | 40 | return |
44 | 41 | try: |
45 | 42 | cls.cmd |
59 | 56 | except KeyError: |
60 | 57 | global run_dir |
61 | 58 | rd=run_dir |
62 | self.node_class=type("Nod3",(waflib.Node.Node,),{}) | |
63 | self.node_class.__module__="waflib.Node" | |
59 | self.node_class=type('Nod3',(waflib.Node.Node,),{}) | |
60 | self.node_class.__module__='waflib.Node' | |
64 | 61 | self.node_class.ctx=self |
65 | 62 | self.root=self.node_class('',None) |
66 | 63 | self.cur_script=None |
68 | 65 | self.stack_path=[] |
69 | 66 | self.exec_dict={'ctx':self,'conf':self,'bld':self,'opt':self} |
70 | 67 | self.logger=None |
71 | def __hash__(self): | |
72 | return id(self) | |
73 | 68 | def finalize(self): |
74 | 69 | try: |
75 | 70 | logger=self.logger |
129 | 124 | if not user_function: |
130 | 125 | if not mandatory: |
131 | 126 | continue |
132 | raise Errors.WafError('No function %s defined in %s'%(name or self.fun,node.abspath())) | |
127 | raise Errors.WafError('No function %r defined in %s'%(name or self.fun,node.abspath())) | |
133 | 128 | user_function(self) |
134 | 129 | finally: |
135 | 130 | self.post_recurse(node) |
144 | 139 | def exec_command(self,cmd,**kw): |
145 | 140 | subprocess=Utils.subprocess |
146 | 141 | kw['shell']=isinstance(cmd,str) |
147 | Logs.debug('runner: %r'%(cmd,)) | |
148 | Logs.debug('runner_env: kw=%s'%kw) | |
142 | Logs.debug('runner: %r',cmd) | |
143 | Logs.debug('runner_env: kw=%s',kw) | |
149 | 144 | if self.logger: |
150 | 145 | self.logger.info(cmd) |
151 | 146 | if'stdout'not in kw: |
153 | 148 | if'stderr'not in kw: |
154 | 149 | kw['stderr']=subprocess.PIPE |
155 | 150 | if Logs.verbose and not kw['shell']and not Utils.check_exe(cmd[0]): |
156 | raise Errors.WafError("Program %s not found!"%cmd[0]) | |
157 | wargs={} | |
151 | raise Errors.WafError('Program %s not found!'%cmd[0]) | |
152 | cargs={} | |
158 | 153 | if'timeout'in kw: |
159 | if kw['timeout']is not None: | |
160 | wargs['timeout']=kw['timeout'] | |
154 | if sys.hexversion>=0x3030000: | |
155 | cargs['timeout']=kw['timeout'] | |
156 | if not'start_new_session'in kw: | |
157 | kw['start_new_session']=True | |
161 | 158 | del kw['timeout'] |
162 | 159 | if'input'in kw: |
163 | 160 | if kw['input']: |
164 | wargs['input']=kw['input'] | |
161 | cargs['input']=kw['input'] | |
165 | 162 | kw['stdin']=subprocess.PIPE |
166 | 163 | del kw['input'] |
167 | try: | |
168 | if kw['stdout']or kw['stderr']: | |
169 | p=subprocess.Popen(cmd,**kw) | |
170 | (out,err)=p.communicate(**wargs) | |
171 | ret=p.returncode | |
172 | else: | |
173 | out,err=(None,None) | |
174 | ret=subprocess.Popen(cmd,**kw).wait(**wargs) | |
164 | if'cwd'in kw: | |
165 | if not isinstance(kw['cwd'],str): | |
166 | kw['cwd']=kw['cwd'].abspath() | |
167 | try: | |
168 | ret,out,err=Utils.run_process(cmd,kw,cargs) | |
175 | 169 | except Exception as e: |
176 | 170 | raise Errors.WafError('Execution failure: %s'%str(e),ex=e) |
177 | 171 | if out: |
178 | 172 | if not isinstance(out,str): |
179 | out=out.decode(sys.stdout.encoding or'iso8859-1') | |
173 | out=out.decode(sys.stdout.encoding or'iso8859-1',errors='replace') | |
180 | 174 | if self.logger: |
181 | self.logger.debug('out: %s'%out) | |
175 | self.logger.debug('out: %s',out) | |
182 | 176 | else: |
183 | 177 | Logs.info(out,extra={'stream':sys.stdout,'c1':''}) |
184 | 178 | if err: |
185 | 179 | if not isinstance(err,str): |
186 | err=err.decode(sys.stdout.encoding or'iso8859-1') | |
180 | err=err.decode(sys.stdout.encoding or'iso8859-1',errors='replace') | |
187 | 181 | if self.logger: |
188 | 182 | self.logger.error('err: %s'%err) |
189 | 183 | else: |
192 | 186 | def cmd_and_log(self,cmd,**kw): |
193 | 187 | subprocess=Utils.subprocess |
194 | 188 | kw['shell']=isinstance(cmd,str) |
195 | Logs.debug('runner: %r'%(cmd,)) | |
189 | Logs.debug('runner: %r',cmd) | |
196 | 190 | if'quiet'in kw: |
197 | 191 | quiet=kw['quiet'] |
198 | 192 | del kw['quiet'] |
204 | 198 | else: |
205 | 199 | to_ret=STDOUT |
206 | 200 | if Logs.verbose and not kw['shell']and not Utils.check_exe(cmd[0]): |
207 | raise Errors.WafError("Program %s not found!"%cmd[0]) | |
201 | raise Errors.WafError('Program %r not found!'%cmd[0]) | |
208 | 202 | kw['stdout']=kw['stderr']=subprocess.PIPE |
209 | 203 | if quiet is None: |
210 | 204 | self.to_log(cmd) |
211 | wargs={} | |
205 | cargs={} | |
212 | 206 | if'timeout'in kw: |
213 | if kw['timeout']is not None: | |
214 | wargs['timeout']=kw['timeout'] | |
207 | if sys.hexversion>=0x3030000: | |
208 | cargs['timeout']=kw['timeout'] | |
209 | if not'start_new_session'in kw: | |
210 | kw['start_new_session']=True | |
215 | 211 | del kw['timeout'] |
216 | 212 | if'input'in kw: |
217 | 213 | if kw['input']: |
218 | wargs['input']=kw['input'] | |
214 | cargs['input']=kw['input'] | |
219 | 215 | kw['stdin']=subprocess.PIPE |
220 | 216 | del kw['input'] |
221 | try: | |
222 | p=subprocess.Popen(cmd,**kw) | |
223 | (out,err)=p.communicate(**wargs) | |
217 | if'cwd'in kw: | |
218 | if not isinstance(kw['cwd'],str): | |
219 | kw['cwd']=kw['cwd'].abspath() | |
220 | try: | |
221 | ret,out,err=Utils.run_process(cmd,kw,cargs) | |
224 | 222 | except Exception as e: |
225 | 223 | raise Errors.WafError('Execution failure: %s'%str(e),ex=e) |
226 | 224 | if not isinstance(out,str): |
227 | out=out.decode(sys.stdout.encoding or'iso8859-1') | |
225 | out=out.decode(sys.stdout.encoding or'iso8859-1',errors='replace') | |
228 | 226 | if not isinstance(err,str): |
229 | err=err.decode(sys.stdout.encoding or'iso8859-1') | |
227 | err=err.decode(sys.stdout.encoding or'iso8859-1',errors='replace') | |
230 | 228 | if out and quiet!=STDOUT and quiet!=BOTH: |
231 | 229 | self.to_log('out: %s'%out) |
232 | 230 | if err and quiet!=STDERR and quiet!=BOTH: |
233 | 231 | self.to_log('err: %s'%err) |
234 | if p.returncode: | |
235 | e=Errors.WafError('Command %r returned %r'%(cmd,p.returncode)) | |
236 | e.returncode=p.returncode | |
232 | if ret: | |
233 | e=Errors.WafError('Command %r returned %r'%(cmd,ret)) | |
234 | e.returncode=ret | |
237 | 235 | e.stderr=err |
238 | 236 | e.stdout=out |
239 | 237 | raise e |
247 | 245 | self.logger.info('from %s: %s'%(self.path.abspath(),msg)) |
248 | 246 | try: |
249 | 247 | msg='%s\n(complete log in %s)'%(msg,self.logger.handlers[0].baseFilename) |
250 | except Exception: | |
248 | except AttributeError: | |
251 | 249 | pass |
252 | 250 | raise self.errors.ConfigurationError(msg,ex=ex) |
253 | 251 | def to_log(self,msg): |
268 | 266 | result=kw['result'] |
269 | 267 | except KeyError: |
270 | 268 | result=k[1] |
271 | color=kw.get('color',None) | |
269 | color=kw.get('color') | |
272 | 270 | if not isinstance(color,str): |
273 | 271 | color=result and'GREEN'or'YELLOW' |
274 | 272 | self.end_msg(result,color,**kw) |
275 | 273 | def start_msg(self,*k,**kw): |
276 | if kw.get('quiet',None): | |
277 | return | |
278 | msg=kw.get('msg',None)or k[0] | |
274 | if kw.get('quiet'): | |
275 | return | |
276 | msg=kw.get('msg')or k[0] | |
279 | 277 | try: |
280 | 278 | if self.in_msg: |
281 | 279 | self.in_msg+=1 |
291 | 289 | self.to_log(x) |
292 | 290 | Logs.pprint('NORMAL',"%s :"%msg.ljust(self.line_just),sep='') |
293 | 291 | def end_msg(self,*k,**kw): |
294 | if kw.get('quiet',None): | |
292 | if kw.get('quiet'): | |
295 | 293 | return |
296 | 294 | self.in_msg-=1 |
297 | 295 | if self.in_msg: |
298 | 296 | return |
299 | result=kw.get('result',None)or k[0] | |
297 | result=kw.get('result')or k[0] | |
300 | 298 | defcolor='GREEN' |
301 | 299 | if result==True: |
302 | 300 | msg='ok' |
326 | 324 | waflibs=PyZipFile(waf_dir) |
327 | 325 | lst=waflibs.namelist() |
328 | 326 | for x in lst: |
329 | if not re.match("waflib/extras/%s"%var.replace("*",".*"),var): | |
327 | if not re.match('waflib/extras/%s'%var.replace('*','.*'),var): | |
330 | 328 | continue |
331 | 329 | f=os.path.basename(x) |
332 | 330 | doban=False |
333 | 331 | for b in ban: |
334 | r=b.replace("*",".*") | |
332 | r=b.replace('*','.*') | |
335 | 333 | if re.match(r,f): |
336 | 334 | doban=True |
337 | 335 | if not doban: |
350 | 348 | raise Errors.WafError('Could not read the file %r'%path) |
351 | 349 | module_dir=os.path.dirname(path) |
352 | 350 | sys.path.insert(0,module_dir) |
353 | try:exec(compile(code,path,'exec'),module.__dict__) | |
354 | finally:sys.path.remove(module_dir) | |
351 | try: | |
352 | exec(compile(code,path,'exec'),module.__dict__) | |
353 | finally: | |
354 | sys.path.remove(module_dir) | |
355 | 355 | cache_modules[path]=module |
356 | 356 | return module |
357 | 357 | def load_tool(tool,tooldir=None,ctx=None,with_sys_path=True): |
359 | 359 | tool='javaw' |
360 | 360 | else: |
361 | 361 | tool=tool.replace('++','xx') |
362 | origSysPath=sys.path | |
363 | if not with_sys_path:sys.path=[] | |
362 | if not with_sys_path: | |
363 | back_path=sys.path | |
364 | sys.path=[] | |
364 | 365 | try: |
365 | 366 | if tooldir: |
366 | 367 | assert isinstance(tooldir,list) |
382 | 383 | break |
383 | 384 | except ImportError: |
384 | 385 | x=None |
385 | if x is None: | |
386 | else: | |
386 | 387 | __import__(tool) |
387 | 388 | finally: |
388 | 389 | if not with_sys_path:sys.path.remove(waf_dir) |
390 | 391 | Context.tools[tool]=ret |
391 | 392 | return ret |
392 | 393 | finally: |
393 | if not with_sys_path:sys.path+=origSysPath | |
394 | if not with_sys_path: | |
395 | sys.path+=back_path |
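In the `exec_command`/`cmd_and_log` hunks above, the inline `Popen`/`communicate` calls are replaced by `Utils.run_process(cmd, kw, cargs)`, with `timeout` routed into `cargs` only on Python 3.3+ (where `communicate()` accepts it). A simplified standalone sketch of that split, not waf's actual `Utils.run_process`:

```python
import subprocess
import sys

def run_process(cmd, timeout=None):
    # Keep Popen keywords and communicate() keywords separate:
    # timeout= belongs to communicate(), and only on Python 3.3+.
    cargs = {}
    if timeout is not None and sys.hexversion >= 0x3030000:
        cargs['timeout'] = timeout
    p = subprocess.Popen(cmd, stdout=subprocess.PIPE,
                         stderr=subprocess.PIPE)
    out, err = p.communicate(**cargs)
    return p.returncode, out, err

ret, out, err = run_process([sys.executable, '-c', 'print("hi")'],
                            timeout=10)
assert ret == 0
assert out.strip() == b'hi'
```

The diff also sets `start_new_session=True` when a timeout is given, so that a timed-out child can be killed together with its process group rather than leaving orphans behind.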
11 | 11 | import logging |
12 | 12 | LOG_FORMAT=os.environ.get('WAF_LOG_FORMAT','%(asctime)s %(c1)s%(zone)s%(c2)s %(message)s') |
13 | 13 | HOUR_FORMAT=os.environ.get('WAF_HOUR_FORMAT','%H:%M:%S') |
14 | zones='' | |
14 | zones=[] | |
15 | 15 | verbose=0 |
16 | 16 | colors_lst={'USE':True,'BOLD':'\x1b[01;1m','RED':'\x1b[01;31m','GREEN':'\x1b[32m','YELLOW':'\x1b[33m','PINK':'\x1b[35m','BLUE':'\x1b[01;34m','CYAN':'\x1b[36m','GREY':'\x1b[37m','NORMAL':'\x1b[0m','cursor_on':'\x1b[?25h','cursor_off':'\x1b[?25l',} |
17 | 17 | indicator='\r\x1b[K%s%s%s' |
38 | 38 | def get_term_cols(): |
39 | 39 | return 80 |
40 | 40 | get_term_cols.__doc__=""" |
41 | Get the console width in characters. | |
41 | Returns the console width in characters. | |
42 | 42 | |
43 | 43 | :return: the number of characters per line |
44 | 44 | :rtype: int |
45 | 45 | """ |
46 | 46 | def get_color(cl): |
47 | if not colors_lst['USE']:return'' | |
48 | return colors_lst.get(cl,'') | |
47 | if colors_lst['USE']: | |
48 | return colors_lst.get(cl,'') | |
49 | return'' | |
49 | 50 | class color_dict(object): |
50 | 51 | def __getattr__(self,a): |
51 | 52 | return get_color(a) |
54 | 55 | colors=color_dict() |
55 | 56 | re_log=re.compile(r'(\w+): (.*)',re.M) |
56 | 57 | class log_filter(logging.Filter): |
57 | def __init__(self,name=None): | |
58 | pass | |
58 | def __init__(self,name=''): | |
59 | logging.Filter.__init__(self,name) | |
59 | 60 | def filter(self,rec): |
61 | global verbose | |
60 | 62 | rec.zone=rec.module |
61 | 63 | if rec.levelno>=logging.INFO: |
62 | 64 | return True |
128 | 130 | else: |
129 | 131 | msg=re.sub(r'\r(?!\n)|\x1B\[(K|.*?(m|h|l))','',msg) |
130 | 132 | if rec.levelno>=logging.INFO: |
133 | if rec.args: | |
134 | return msg%rec.args | |
131 | 135 | return msg |
132 | 136 | rec.msg=msg |
133 | 137 | rec.c1=colors.PINK |
135 | 139 | return logging.Formatter.format(self,rec) |
136 | 140 | log=None |
137 | 141 | def debug(*k,**kw): |
142 | global verbose | |
138 | 143 | if verbose: |
139 | 144 | k=list(k) |
140 | 145 | k[0]=k[0].replace('\n',' ') |
141 | 146 | global log |
142 | 147 | log.debug(*k,**kw) |
143 | 148 | def error(*k,**kw): |
144 | global log | |
149 | global log,verbose | |
145 | 150 | log.error(*k,**kw) |
146 | 151 | if verbose>2: |
147 | 152 | st=traceback.extract_stack() |
149 | 154 | st=st[:-1] |
150 | 155 | buf=[] |
151 | 156 | for filename,lineno,name,line in st: |
152 | buf.append(' File "%s", line %d, in %s'%(filename,lineno,name)) | |
157 | buf.append(' File %r, line %d, in %s'%(filename,lineno,name)) | |
153 | 158 | if line: |
154 | 159 | buf.append(' %s'%line.strip()) |
155 | if buf:log.error("\n".join(buf)) | |
160 | if buf:log.error('\n'.join(buf)) | |
156 | 161 | def warn(*k,**kw): |
157 | 162 | global log |
158 | 163 | log.warn(*k,**kw) |
195 | 200 | except Exception: |
196 | 201 | pass |
197 | 202 | def pprint(col,msg,label='',sep='\n'): |
198 | info("%s%s%s %s"%(colors(col),msg,colors.NORMAL,label),extra={'terminator':sep}) | |
203 | global info | |
204 | info('%s%s%s %s',colors(col),msg,colors.NORMAL,label,extra={'terminator':sep}) |
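A recurring change throughout this diff is switching from eager interpolation, `Logs.info('- remove %s' % tgt)`, to deferred interpolation, `Logs.info('- remove %s', tgt)`; the formatter hunk above (`msg % rec.args`) then applies the arguments only when a record is actually emitted. A small self-contained illustration using the stdlib `logging` module directly:

```python
import io
import logging

# Deferred formatting: the string is only built if the record passes
# the level/filter checks, which is the point of the '%s', arg form.
stream = io.StringIO()
handler = logging.StreamHandler(stream)
logger = logging.getLogger('waf-sketch')
logger.addHandler(handler)
logger.setLevel(logging.INFO)

tgt = '/usr/local/lib/libaubio.so'
logger.info('- remove %s', tgt)           # interpolated at emit time
logger.debug('- skipped %s', object())    # below INFO: never formatted

assert stream.getvalue().strip() == '- remove /usr/local/lib/libaubio.so'
```

Besides skipping work for suppressed messages, this form avoids crashes when the argument itself contains `%` characters that an eager `%`-format would try to interpret.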
34 | 34 | **/_darcs/** |
35 | 35 | **/.intlcache |
36 | 36 | **/.DS_Store''' |
37 | split_path=Utils.split_path | |
38 | split_path_unix=Utils.split_path_unix | |
39 | split_path_cygwin=Utils.split_path_cygwin | |
40 | split_path_win32=Utils.split_path_win32 | |
41 | 37 | class Node(object): |
42 | 38 | dict_class=dict |
43 | __slots__=('name','sig','children','parent','cache_abspath','cache_isdir','cache_sig') | |
39 | __slots__=('name','parent','children','cache_abspath','cache_isdir') | |
44 | 40 | def __init__(self,name,parent): |
45 | 41 | self.name=name |
46 | 42 | self.parent=parent |
53 | 49 | self.parent=data[1] |
54 | 50 | if data[2]is not None: |
55 | 51 | self.children=self.dict_class(data[2]) |
56 | if data[3]is not None: | |
57 | self.sig=data[3] | |
58 | 52 | def __getstate__(self): |
59 | return(self.name,self.parent,getattr(self,'children',None),getattr(self,'sig',None)) | |
53 | return(self.name,self.parent,getattr(self,'children',None)) | |
60 | 54 | def __str__(self): |
61 | return self.name | |
55 | return self.abspath() | |
62 | 56 | def __repr__(self): |
63 | 57 | return self.abspath() |
64 | def __hash__(self): | |
65 | return id(self) | |
66 | def __eq__(self,node): | |
67 | return id(self)==id(node) | |
68 | 58 | def __copy__(self): |
69 | 59 | raise Errors.WafError('nodes are not supposed to be copied') |
70 | 60 | def read(self,flags='r',encoding='ISO8859-1'): |
102 | 92 | newline='' |
103 | 93 | output=json.dumps(data,indent=indent,separators=separators,sort_keys=sort_keys)+newline |
104 | 94 | self.write(output,encoding='utf-8') |
95 | def exists(self): | |
96 | return os.path.exists(self.abspath()) | |
97 | def isdir(self): | |
98 | return os.path.isdir(self.abspath()) | |
105 | 99 | def chmod(self,val): |
106 | 100 | os.chmod(self.abspath(),val) |
107 | def delete(self): | |
108 | try: | |
109 | try: | |
110 | if hasattr(self,'children'): | |
101 | def delete(self,evict=True): | |
102 | try: | |
103 | try: | |
104 | if os.path.isdir(self.abspath()): | |
111 | 105 | shutil.rmtree(self.abspath()) |
112 | 106 | else: |
113 | 107 | os.remove(self.abspath()) |
115 | 109 | if os.path.exists(self.abspath()): |
116 | 110 | raise e |
117 | 111 | finally: |
118 | self.evict() | |
112 | if evict: | |
113 | self.evict() | |
119 | 114 | def evict(self): |
120 | 115 | del self.parent.children[self.name] |
121 | 116 | def suffix(self): |
133 | 128 | lst.sort() |
134 | 129 | return lst |
135 | 130 | def mkdir(self): |
136 | if getattr(self,'cache_isdir',None): | |
131 | if self.isdir(): | |
137 | 132 | return |
138 | 133 | try: |
139 | 134 | self.parent.mkdir() |
144 | 139 | os.makedirs(self.abspath()) |
145 | 140 | except OSError: |
146 | 141 | pass |
147 | if not os.path.isdir(self.abspath()): | |
148 | raise Errors.WafError('Could not create the directory %s'%self.abspath()) | |
142 | if not self.isdir(): | |
143 | raise Errors.WafError('Could not create the directory %r'%self) | |
149 | 144 | try: |
150 | 145 | self.children |
151 | 146 | except AttributeError: |
152 | 147 | self.children=self.dict_class() |
153 | self.cache_isdir=True | |
154 | 148 | def find_node(self,lst): |
155 | 149 | if isinstance(lst,str): |
156 | lst=[x for x in split_path(lst)if x and x!='.'] | |
150 | lst=[x for x in Utils.split_path(lst)if x and x!='.'] | |
157 | 151 | cur=self |
158 | 152 | for x in lst: |
159 | 153 | if x=='..': |
170 | 164 | except KeyError: |
171 | 165 | pass |
172 | 166 | cur=self.__class__(x,cur) |
173 | try: | |
174 | os.stat(cur.abspath()) | |
175 | except OSError: | |
167 | if not cur.exists(): | |
176 | 168 | cur.evict() |
177 | 169 | return None |
178 | ret=cur | |
179 | try: | |
180 | os.stat(ret.abspath()) | |
181 | except OSError: | |
182 | ret.evict() | |
170 | if not cur.exists(): | |
171 | cur.evict() | |
183 | 172 | return None |
184 | try: | |
185 | while not getattr(cur.parent,'cache_isdir',None): | |
186 | cur=cur.parent | |
187 | cur.cache_isdir=True | |
188 | except AttributeError: | |
189 | pass | |
190 | return ret | |
173 | return cur | |
191 | 174 | def make_node(self,lst): |
192 | 175 | if isinstance(lst,str): |
193 | lst=[x for x in split_path(lst)if x and x!='.'] | |
176 | lst=[x for x in Utils.split_path(lst)if x and x!='.'] | |
194 | 177 | cur=self |
195 | 178 | for x in lst: |
196 | 179 | if x=='..': |
197 | 180 | cur=cur.parent or cur |
198 | 181 | continue |
199 | if getattr(cur,'children',{}): | |
200 | if x in cur.children: | |
201 | cur=cur.children[x] | |
202 | continue | |
182 | try: | |
183 | cur=cur.children[x] | |
184 | except AttributeError: | |
185 | cur.children=self.dict_class() | |
186 | except KeyError: | |
187 | pass | |
203 | 188 | else: |
204 | cur.children=self.dict_class() | |
189 | continue | |
205 | 190 | cur=self.__class__(x,cur) |
206 | 191 | return cur |
207 | 192 | def search_node(self,lst): |
208 | 193 | if isinstance(lst,str): |
209 | lst=[x for x in split_path(lst)if x and x!='.'] | |
194 | lst=[x for x in Utils.split_path(lst)if x and x!='.'] | |
210 | 195 | cur=self |
211 | 196 | for x in lst: |
212 | 197 | if x=='..': |
232 | 217 | up+=1 |
233 | 218 | c2=c2.parent |
234 | 219 | c2h-=1 |
235 | while id(c1)!=id(c2): | |
220 | while not c1 is c2: | |
236 | 221 | lst.append(c1.name) |
237 | 222 | up+=1 |
238 | 223 | c1=c1.parent |
278 | 263 | while diff>0: |
279 | 264 | diff-=1 |
280 | 265 | p=p.parent |
281 | return id(p)==id(node) | |
266 | return p is node | |
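The Node hunks systematically replace `id(a) == id(b)` with `a is b`, which is the idiomatic identity test; `is_child_of` above climbs the shallower of the two nodes until both sit at the same depth, then compares identity. A tiny stand-in (the `N` class and `height` helper are our reconstruction, not waflib's Node):

```python
class N(object):
    # Minimal stand-in for a waflib Node: a name and a parent link.
    def __init__(self, name, parent=None):
        self.name = name
        self.parent = parent

    def height(self):
        d, cur = 0, self
        while cur.parent:
            d += 1
            cur = cur.parent
        return d

    def is_child_of(self, node):
        # Climb until both nodes are at the same depth, then compare
        # identity, mirroring the `p is node` form the diff adopts.
        p, diff = self, self.height() - node.height()
        while diff > 0:
            diff -= 1
            p = p.parent
        return p is node

root = N('/')
src = N('src', root)
f = N('main.c', src)
assert f.is_child_of(root)
assert f.is_child_of(src)
assert not src.is_child_of(f)
```

Since every path maps to exactly one Node object per context, identity comparison is both correct and cheaper than comparing names or absolute paths.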
282 | 267 | def ant_iter(self,accept=None,maxdepth=25,pats=[],dir=False,src=True,remove=True): |
283 | 268 | dircont=self.listdir() |
284 | 269 | dircont.sort() |
295 | 280 | if npats and npats[0]: |
296 | 281 | accepted=[]in npats[0] |
297 | 282 | node=self.make_node([name]) |
298 | isdir=os.path.isdir(node.abspath()) | |
283 | isdir=node.isdir() | |
299 | 284 | if accepted: |
300 | 285 | if isdir: |
301 | 286 | if dir: |
303 | 288 | else: |
304 | 289 | if src: |
305 | 290 | yield node |
306 | if getattr(node,'cache_isdir',None)or isdir: | |
291 | if isdir: | |
307 | 292 | node.cache_isdir=True |
308 | 293 | if maxdepth: |
309 | 294 | for k in node.ant_iter(accept=accept,maxdepth=maxdepth-1,pats=npats,dir=dir,src=src,remove=remove): |
333 | 318 | try: |
334 | 319 | accu.append(re.compile(k,flags=reflags)) |
335 | 320 | except Exception as e: |
336 | raise Errors.WafError("Invalid pattern: %s"%k,e) | |
321 | raise Errors.WafError('Invalid pattern: %s'%k,e) | |
337 | 322 | ret.append(accu) |
338 | 323 | return ret |
339 | 324 | def filtre(name,nn): |
363 | 348 | return ret |
364 | 349 | def is_src(self): |
365 | 350 | cur=self |
366 | x=id(self.ctx.srcnode) | |
367 | y=id(self.ctx.bldnode) | |
351 | x=self.ctx.srcnode | |
352 | y=self.ctx.bldnode | |
368 | 353 | while cur.parent: |
369 | if id(cur)==y: | |
354 | if cur is y: | |
370 | 355 | return False |
371 | if id(cur)==x: | |
356 | if cur is x: | |
372 | 357 | return True |
373 | 358 | cur=cur.parent |
374 | 359 | return False |
375 | 360 | def is_bld(self): |
376 | 361 | cur=self |
377 | y=id(self.ctx.bldnode) | |
362 | y=self.ctx.bldnode | |
378 | 363 | while cur.parent: |
379 | if id(cur)==y: | |
364 | if cur is y: | |
380 | 365 | return True |
381 | 366 | cur=cur.parent |
382 | 367 | return False |
383 | 368 | def get_src(self): |
384 | 369 | cur=self |
385 | x=id(self.ctx.srcnode) | |
386 | y=id(self.ctx.bldnode) | |
370 | x=self.ctx.srcnode | |
371 | y=self.ctx.bldnode | |
387 | 372 | lst=[] |
388 | 373 | while cur.parent: |
389 | if id(cur)==y: | |
374 | if cur is y: | |
390 | 375 | lst.reverse() |
391 | return self.ctx.srcnode.make_node(lst) | |
392 | if id(cur)==x: | |
376 | return x.make_node(lst) | |
377 | if cur is x: | |
393 | 378 | return self |
394 | 379 | lst.append(cur.name) |
395 | 380 | cur=cur.parent |
396 | 381 | return self |
397 | 382 | def get_bld(self): |
398 | 383 | cur=self |
399 | x=id(self.ctx.srcnode) | |
400 | y=id(self.ctx.bldnode) | |
384 | x=self.ctx.srcnode | |
385 | y=self.ctx.bldnode | |
401 | 386 | lst=[] |
402 | 387 | while cur.parent: |
403 | if id(cur)==y: | |
388 | if cur is y: | |
404 | 389 | return self |
405 | if id(cur)==x: | |
390 | if cur is x: | |
406 | 391 | lst.reverse() |
407 | 392 | return self.ctx.bldnode.make_node(lst) |
408 | 393 | lst.append(cur.name) |
413 | 398 | return self.ctx.bldnode.make_node(['__root__']+lst) |
414 | 399 | def find_resource(self,lst): |
415 | 400 | if isinstance(lst,str): |
416 | lst=[x for x in split_path(lst)if x and x!='.'] | |
401 | lst=[x for x in Utils.split_path(lst)if x and x!='.'] | |
417 | 402 | node=self.get_bld().search_node(lst) |
418 | 403 | if not node: |
419 | self=self.get_src() | |
420 | node=self.find_node(lst) | |
421 | if node: | |
422 | if os.path.isdir(node.abspath()): | |
423 | return None | |
404 | node=self.get_src().find_node(lst) | |
405 | if node and node.isdir(): | |
406 | return None | |
424 | 407 | return node |
425 | 408 | def find_or_declare(self,lst): |
426 | 409 | if isinstance(lst,str): |
427 | lst=[x for x in split_path(lst)if x and x!='.'] | |
410 | lst=[x for x in Utils.split_path(lst)if x and x!='.'] | |
428 | 411 | node=self.get_bld().search_node(lst) |
429 | 412 | if node: |
430 | 413 | if not os.path.isfile(node.abspath()): |
431 | node.sig=None | |
432 | 414 | node.parent.mkdir() |
433 | 415 | return node |
434 | 416 | self=self.get_src() |
435 | 417 | node=self.find_node(lst) |
436 | 418 | if node: |
437 | if not os.path.isfile(node.abspath()): | |
438 | node.sig=None | |
439 | node.parent.mkdir() | |
440 | 419 | return node |
441 | 420 | node=self.get_bld().make_node(lst) |
442 | 421 | node.parent.mkdir() |
443 | 422 | return node |
444 | 423 | def find_dir(self,lst): |
445 | 424 | if isinstance(lst,str): |
446 | lst=[x for x in split_path(lst)if x and x!='.'] | |
425 | lst=[x for x in Utils.split_path(lst)if x and x!='.'] | |
447 | 426 | node=self.find_node(lst) |
448 | try: | |
449 | if not os.path.isdir(node.abspath()): | |
450 | return None | |
451 | except(OSError,AttributeError): | |
427 | if node and not node.isdir(): | |
452 | 428 | return None |
453 | 429 | return node |
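Several lookup methods above normalize a path string through `Utils.split_path` and drop empty and `'.'` components before searching. A sketch of that filtering, with a simplified POSIX-only stand-in for `split_path`:

```python
def split_path(path):
    # simplified stand-in for waflib.Utils.split_path (POSIX only)
    return path.split('/')

# Leading './' and doubled slashes collapse away, as in find_resource.
lst = [x for x in split_path('./src//main.c') if x and x != '.']
assert lst == ['src', 'main.c']
```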
454 | 430 | def change_ext(self,ext,ext_in=None): |
468 | 444 | return self.path_from(self.ctx.srcnode) |
469 | 445 | def relpath(self): |
470 | 446 | cur=self |
471 | x=id(self.ctx.bldnode) | |
447 | x=self.ctx.bldnode | |
472 | 448 | while cur.parent: |
473 | if id(cur)==x: | |
449 | if cur is x: | |
474 | 450 | return self.bldpath() |
475 | 451 | cur=cur.parent |
476 | 452 | return self.srcpath() |
477 | 453 | def bld_dir(self): |
478 | 454 | return self.parent.bldpath() |
455 | def h_file(self): | |
456 | return Utils.h_file(self.abspath()) | |
479 | 457 | def get_bld_sig(self): |
480 | 458 | try: |
481 | return self.cache_sig | |
459 | cache=self.ctx.cache_sig | |
482 | 460 | except AttributeError: |
461 | cache=self.ctx.cache_sig={} | |
462 | try: | |
463 | ret=cache[self] | |
464 | except KeyError: | |
465 | p=self.abspath() | |
466 | try: | |
467 | ret=cache[self]=self.h_file() | |
468 | except EnvironmentError: | |
469 | if self.isdir(): | |
470 | st=os.stat(p) | |
471 | ret=cache[self]=Utils.h_list([p,st.st_ino,st.st_mode]) | |
472 | return ret | |
473 | raise | |
474 | return ret | |
475 | def get_sig(self): | |
476 | return self.h_file() | |
477 | def set_sig(self,val): | |
478 | try: | |
479 | del self.get_bld_sig.__cache__[(self,)] | |
480 | except(AttributeError,KeyError): | |
483 | 481 | pass |
484 | if not self.is_bld()or self.ctx.bldnode is self.ctx.srcnode: | |
485 | self.sig=Utils.h_file(self.abspath()) | |
486 | self.cache_sig=ret=self.sig | |
487 | return ret | |
482 | sig=property(get_sig,set_sig) | |
483 | cache_sig=property(get_sig,set_sig) | |
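The rewritten `get_bld_sig` memoizes file hashes in a per-context dictionary (`ctx.cache_sig`) instead of a per-node attribute, creating the cache lazily on first use. A self-contained sketch of that try/except caching pattern (names are illustrative; the directory/stat fallback is omitted):

```python
import hashlib
import os
import tempfile

_cache_sig = {}  # stands in for ctx.cache_sig, keyed by path

def get_bld_sig(path):
    # Hash each file at most once per run; the KeyError branch mirrors
    # waf's lazy-initialization style.
    try:
        return _cache_sig[path]
    except KeyError:
        with open(path, 'rb') as f:
            sig = hashlib.sha1(f.read()).hexdigest()
        _cache_sig[path] = sig
        return sig
```

Unlike waf's version, this sketch does not fall back to `os.stat` hashing for directories.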
488 | 484 | pickle_lock=Utils.threading.Lock() |
489 | 485 | class Nod3(Node): |
490 | 486 | pass |
2 | 2 | # WARNING! Do not edit! https://waf.io/book/index.html#_obtaining_the_waf_file |
3 | 3 | |
4 | 4 | import os,tempfile,optparse,sys,re |
5 | from waflib import Logs,Utils,Context | |
6 | cmds='distclean configure build install clean uninstall check dist distcheck'.split() | |
5 | from waflib import Logs,Utils,Context,Errors | |
7 | 6 | options={} |
8 | 7 | commands=[] |
9 | 8 | envvars=[] |
10 | 9 | lockfile=os.environ.get('WAFLOCK','.lock-waf_%s_build'%sys.platform) |
11 | platform=Utils.unversioned_sys_platform() | |
12 | 10 | class opt_parser(optparse.OptionParser): |
13 | 11 | def __init__(self,ctx): |
14 | 12 | optparse.OptionParser.__init__(self,conflict_handler="resolve",version='waf %s (%s)'%(Context.WAFVERSION,Context.WAFREVISION)) |
56 | 54 | p('-k','--keep',dest='keep',default=0,action='count',help='continue despite errors (-kk to try harder)') |
57 | 55 | p('-v','--verbose',dest='verbose',default=0,action='count',help='verbosity level -v -vv or -vvv [default: 0]') |
58 | 56 | p('--zones',dest='zones',default='',action='store',help='debugging zones (task_gen, deps, tasks, etc)') |
57 | p('--profile',dest='profile',default='',action='store_true',help=optparse.SUPPRESS_HELP) | |
59 | 58 | gr=self.add_option_group('Configuration options') |
60 | 59 | self.option_groups['configure options']=gr |
61 | 60 | gr.add_option('-o','--out',action='store',default='',help='build dir for the project',dest='out') |
65 | 64 | gr.add_option('--no-lock-in-top',action='store_true',default='',help=optparse.SUPPRESS_HELP,dest='no_lock_in_top') |
66 | 65 | default_prefix=getattr(Context.g_module,'default_prefix',os.environ.get('PREFIX')) |
67 | 66 | if not default_prefix: |
68 | if platform=='win32': | |
67 | if Utils.unversioned_sys_platform()=='win32': | |
69 | 68 | d=tempfile.gettempdir() |
70 | 69 | default_prefix=d[0].upper()+d[1:] |
71 | 70 | else: |
100 | 99 | if not count and os.name not in('nt','java'): |
101 | 100 | try: |
102 | 101 | tmp=self.cmd_and_log(['sysctl','-n','hw.ncpu'],quiet=0) |
103 | except Exception: | |
102 | except Errors.WafError: | |
104 | 103 | pass |
105 | 104 | else: |
106 | 105 | if re.match('^[0-9]+$',tmp): |
144 | 143 | def execute(self): |
145 | 144 | super(OptionsContext,self).execute() |
146 | 145 | self.parse_args() |
146 | Utils.alloc_process_pool(options.jobs) |
1 | 1 | # encoding: utf-8 |
2 | 2 | # WARNING! Do not edit! https://waf.io/book/index.html#_obtaining_the_waf_file |
3 | 3 | |
4 | import random,atexit | |
4 | import random | |
5 | 5 | try: |
6 | 6 | from queue import Queue |
7 | 7 | except ImportError: |
8 | 8 | from Queue import Queue |
9 | 9 | from waflib import Utils,Task,Errors,Logs |
10 | GAP=10 | |
11 | class TaskConsumer(Utils.threading.Thread): | |
12 | def __init__(self): | |
10 | GAP=20 | |
11 | class Consumer(Utils.threading.Thread): | |
12 | def __init__(self,spawner,task): | |
13 | 13 | Utils.threading.Thread.__init__(self) |
14 | self.ready=Queue() | |
14 | self.task=task | |
15 | self.spawner=spawner | |
16 | self.setDaemon(1) | |
17 | self.start() | |
18 | def run(self): | |
19 | try: | |
20 | if not self.spawner.master.stop: | |
21 | self.task.process() | |
22 | finally: | |
23 | self.spawner.sem.release() | |
24 | self.spawner.master.out.put(self.task) | |
25 | self.task=None | |
26 | self.spawner=None | |
27 | class Spawner(Utils.threading.Thread): | |
28 | def __init__(self,master): | |
29 | Utils.threading.Thread.__init__(self) | |
30 | self.master=master | |
31 | self.sem=Utils.threading.Semaphore(master.numjobs) | |
15 | 32 | self.setDaemon(1) |
16 | 33 | self.start() |
17 | 34 | def run(self): |
20 | 37 | except Exception: |
21 | 38 | pass |
22 | 39 | def loop(self): |
40 | master=self.master | |
23 | 41 | while 1: |
24 | tsk=self.ready.get() | |
25 | if not isinstance(tsk,Task.TaskBase): | |
26 | tsk(self) | |
27 | else: | |
28 | tsk.process() | |
29 | pool=Queue() | |
30 | def get_pool(): | |
31 | try: | |
32 | return pool.get(False) | |
33 | except Exception: | |
34 | return TaskConsumer() | |
35 | def put_pool(x): | |
36 | pool.put(x) | |
37 | def _free_resources(): | |
38 | global pool | |
39 | lst=[] | |
40 | while pool.qsize(): | |
41 | lst.append(pool.get()) | |
42 | for x in lst: | |
43 | x.ready.put(None) | |
44 | for x in lst: | |
45 | x.join() | |
46 | pool=None | |
47 | atexit.register(_free_resources) | |
42 | task=master.ready.get() | |
43 | self.sem.acquire() | |
44 | if not master.stop: | |
45 | task.log_display(task.generator.bld) | |
46 | Consumer(self,task) | |
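These hunks replace the old recycled `TaskConsumer` pool with a single `Spawner` thread that starts one short-lived `Consumer` per task, throttled by a semaphore sized to `numjobs`. A hedged, self-contained sketch of that scheme (names and structure simplified from the diff):

```python
import queue
import threading

def run_all(tasks, numjobs=2):
    ready = queue.Queue()
    out = queue.Queue()
    sem = threading.Semaphore(numjobs)  # caps live consumer threads
    for t in tasks:
        ready.put(t)

    def consumer(task):
        try:
            task()                      # like Consumer.run: process one task
        finally:
            sem.release()               # free a slot for the spawner
            out.put(task)               # report back, like master.out

    def spawner():
        while True:                     # like Spawner.loop
            task = ready.get()
            if task is None:
                return
            sem.acquire()               # block until a worker slot is free
            threading.Thread(target=consumer, args=(task,), daemon=True).start()

    threading.Thread(target=spawner, daemon=True).start()
    done = [out.get() for _ in tasks]   # wait for every task to come back
    ready.put(None)                     # stop the spawner
    return done
```

The semaphore is what keeps at most `numjobs` tasks in flight even though threads are no longer pooled.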
48 | 47 | class Parallel(object): |
49 | 48 | def __init__(self,bld,j=2): |
50 | 49 | self.numjobs=j |
51 | 50 | self.bld=bld |
52 | self.outstanding=[] | |
53 | self.frozen=[] | |
51 | self.outstanding=Utils.deque() | |
52 | self.frozen=Utils.deque() | |
53 | self.ready=Queue(0) | |
54 | 54 | self.out=Queue(0) |
55 | 55 | self.count=0 |
56 | 56 | self.processed=1 |
58 | 58 | self.error=[] |
59 | 59 | self.biter=None |
60 | 60 | self.dirty=False |
61 | self.spawner=Spawner(self) | |
61 | 62 | def get_next_task(self): |
62 | 63 | if not self.outstanding: |
63 | 64 | return None |
64 | return self.outstanding.pop(0) | |
65 | return self.outstanding.popleft() | |
65 | 66 | def postpone(self,tsk): |
66 | 67 | if random.randint(0,1): |
67 | self.frozen.insert(0,tsk) | |
68 | self.frozen.appendleft(tsk) | |
68 | 69 | else: |
69 | 70 | self.frozen.append(tsk) |
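The `Parallel` hunks swap plain lists for `Utils.deque`, so the scheduler's front-of-queue operations become O(1). The equivalent stdlib operations:

```python
from collections import deque

q = deque([1, 2, 3])
q.appendleft(0)          # replaces list.insert(0, x)
assert q.popleft() == 0  # replaces list.pop(0), but O(1)
q.extend([4, 5])         # replaces `outstanding += frozen`
assert list(q) == [1, 2, 3, 4, 5]
```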
70 | 71 | def refill_task_list(self): |
91 | 92 | raise Errors.WafError('Deadlock detected: %s%s'%(msg,''.join(lst))) |
92 | 93 | self.deadlock=self.processed |
93 | 94 | if self.frozen: |
94 | self.outstanding+=self.frozen | |
95 | self.frozen=[] | |
95 | self.outstanding.extend(self.frozen) | |
96 | self.frozen.clear() | |
96 | 97 | elif not self.count: |
97 | 98 | self.outstanding.extend(next(self.biter)) |
98 | 99 | self.total=self.bld.total() |
99 | 100 | break |
100 | 101 | def add_more_tasks(self,tsk): |
101 | 102 | if getattr(tsk,'more_tasks',None): |
102 | self.outstanding+=tsk.more_tasks | |
103 | self.outstanding.extend(tsk.more_tasks) | |
103 | 104 | self.total+=len(tsk.more_tasks) |
104 | 105 | def get_out(self): |
105 | 106 | tsk=self.out.get() |
109 | 110 | self.dirty=True |
110 | 111 | return tsk |
111 | 112 | def add_task(self,tsk): |
112 | try: | |
113 | self.pool | |
114 | except AttributeError: | |
115 | self.init_task_pool() | |
116 | 113 | self.ready.put(tsk) |
117 | def init_task_pool(self): | |
118 | pool=self.pool=[get_pool()for i in range(self.numjobs)] | |
119 | self.ready=Queue(0) | |
120 | def setq(consumer): | |
121 | consumer.ready=self.ready | |
122 | for x in pool: | |
123 | x.ready.put(setq) | |
124 | return pool | |
125 | def free_task_pool(self): | |
126 | def setq(consumer): | |
127 | consumer.ready=Queue(0) | |
128 | self.out.put(self) | |
129 | try: | |
130 | pool=self.pool | |
131 | except AttributeError: | |
132 | pass | |
133 | else: | |
134 | for x in pool: | |
135 | self.ready.put(setq) | |
136 | for x in pool: | |
137 | self.get_out() | |
138 | for x in pool: | |
139 | put_pool(x) | |
140 | self.pool=[] | |
141 | 114 | def skip(self,tsk): |
142 | 115 | tsk.hasrun=Task.SKIPPED |
143 | 116 | def error_handler(self,tsk): |
144 | 117 | if hasattr(tsk,'scan')and hasattr(tsk,'uid'): |
145 | key=(tsk.uid(),'imp') | |
146 | 118 | try: |
147 | del self.bld.task_sigs[key] | |
119 | del self.bld.imp_sigs[tsk.uid()] | |
148 | 120 | except KeyError: |
149 | 121 | pass |
150 | 122 | if not self.bld.keep: |
186 | 158 | break |
187 | 159 | st=self.task_status(tsk) |
188 | 160 | if st==Task.RUN_ME: |
189 | tsk.position=(self.processed,self.total) | |
190 | 161 | self.count+=1 |
191 | tsk.master=self | |
192 | 162 | self.processed+=1 |
193 | 163 | if self.numjobs==1: |
194 | tsk.process() | |
164 | tsk.log_display(tsk.generator.bld) | |
165 | try: | |
166 | tsk.process() | |
167 | finally: | |
168 | self.out.put(tsk) | |
195 | 169 | else: |
196 | 170 | self.add_task(tsk) |
197 | 171 | if st==Task.ASK_LATER: |
202 | 176 | self.add_more_tasks(tsk) |
203 | 177 | while self.error and self.count: |
204 | 178 | self.get_out() |
179 | self.ready.put(None) | |
205 | 180 | assert(self.count==0 or self.stop) |
206 | self.free_task_pool() |
9 | 9 | def waf_entry_point(current_directory,version,wafdir): |
10 | 10 | Logs.init_log() |
11 | 11 | if Context.WAFVERSION!=version: |
12 | Logs.error('Waf script %r and library %r do not match (directory %r)'%(version,Context.WAFVERSION,wafdir)) | |
12 | Logs.error('Waf script %r and library %r do not match (directory %r)',version,Context.WAFVERSION,wafdir) | |
13 | 13 | sys.exit(1) |
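Throughout these files, `Logs.error('… %r' % x)` becomes `Logs.error('… %r', x)`: the arguments are passed through so interpolation is deferred to the logging layer and skipped when the record is filtered out. The stdlib `logging` equivalent:

```python
import io
import logging

stream = io.StringIO()
logger = logging.getLogger('waf-demo')
logger.addHandler(logging.StreamHandler(stream))

# Arguments are interpolated only when the record is actually emitted.
logger.error('Waf script %r and library %r do not match', '1.8.9', '1.8.10')
assert "'1.8.9'" in stream.getvalue()
```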
14 | 14 | if'--version'in sys.argv: |
15 | 15 | Context.run_dir=current_directory |
24 | 24 | sys.argv.pop(1) |
25 | 25 | Context.waf_dir=wafdir |
26 | 26 | Context.launch_dir=current_directory |
27 | no_climb=os.environ.get('NOCLIMB',None) | |
27 | no_climb=os.environ.get('NOCLIMB') | |
28 | 28 | if not no_climb: |
29 | 29 | for k in no_climb_commands: |
30 | 30 | for y in sys.argv: |
44 | 44 | lst=os.listdir(cur) |
45 | 45 | except OSError: |
46 | 46 | lst=[] |
47 | Logs.error('Directory %r is unreadable!'%cur) | |
47 | Logs.error('Directory %r is unreadable!',cur) | |
48 | 48 | if Options.lockfile in lst: |
49 | 49 | env=ConfigSet.ConfigSet() |
50 | 50 | try: |
51 | 51 | env.load(os.path.join(cur,Options.lockfile)) |
52 | 52 | ino=os.stat(cur)[stat.ST_INO] |
53 | except Exception: | |
53 | except EnvironmentError: | |
54 | 54 | pass |
55 | 55 | else: |
56 | 56 | for x in(env.run_dir,env.top_dir,env.out_dir): |
68 | 68 | load=True |
69 | 69 | break |
70 | 70 | else: |
71 | Logs.warn('invalid lock file in %s'%cur) | |
71 | Logs.warn('invalid lock file in %s',cur) | |
72 | 72 | load=False |
73 | 73 | if load: |
74 | 74 | Context.run_dir=env.run_dir |
92 | 92 | ctx.curdir=current_directory |
93 | 93 | ctx.parse_args() |
94 | 94 | sys.exit(0) |
95 | Logs.error('Waf: Run from a directory containing a file named %r'%Context.WSCRIPT_FILE) | |
95 | Logs.error('Waf: Run from a directory containing a file named %r',Context.WSCRIPT_FILE) | |
96 | 96 | sys.exit(1) |
97 | 97 | try: |
98 | 98 | os.chdir(Context.run_dir) |
99 | 99 | except OSError: |
100 | Logs.error('Waf: The folder %r is unreadable'%Context.run_dir) | |
100 | Logs.error('Waf: The folder %r is unreadable',Context.run_dir) | |
101 | 101 | sys.exit(1) |
102 | 102 | try: |
103 | 103 | set_main_module(os.path.normpath(os.path.join(Context.run_dir,Context.WSCRIPT_FILE))) |
106 | 106 | Logs.error(str(e)) |
107 | 107 | sys.exit(1) |
108 | 108 | except Exception as e: |
109 | Logs.error('Waf: The wscript in %r is unreadable'%Context.run_dir,e) | |
109 | Logs.error('Waf: The wscript in %r is unreadable',Context.run_dir) | |
110 | 110 | traceback.print_exc(file=sys.stdout) |
111 | 111 | sys.exit(2) |
112 | try: | |
113 | run_commands() | |
114 | except Errors.WafError as e: | |
115 | if Logs.verbose>1: | |
116 | Logs.pprint('RED',e.verbose_msg) | |
117 | Logs.error(e.msg) | |
118 | sys.exit(1) | |
119 | except SystemExit: | |
120 | raise | |
121 | except Exception as e: | |
122 | traceback.print_exc(file=sys.stdout) | |
123 | sys.exit(2) | |
124 | except KeyboardInterrupt: | |
125 | Logs.pprint('RED','Interrupted') | |
126 | sys.exit(68) | |
112 | if'--profile'in sys.argv: | |
113 | import cProfile,pstats | |
114 | cProfile.runctx('from waflib import Scripting; Scripting.run_commands()',{},{},'profi.txt') | |
115 | p=pstats.Stats('profi.txt') | |
116 | p.sort_stats('time').print_stats(75) | |
117 | else: | |
118 | try: | |
119 | run_commands() | |
120 | except Errors.WafError as e: | |
121 | if Logs.verbose>1: | |
122 | Logs.pprint('RED',e.verbose_msg) | |
123 | Logs.error(e.msg) | |
124 | sys.exit(1) | |
125 | except SystemExit: | |
126 | raise | |
127 | except Exception as e: | |
128 | traceback.print_exc(file=sys.stdout) | |
129 | sys.exit(2) | |
130 | except KeyboardInterrupt: | |
131 | Logs.pprint('RED','Interrupted') | |
132 | sys.exit(68) | |
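The new `--profile` branch runs `Scripting.run_commands` under `cProfile`, then prints the hottest entries sorted by internal time. The same recipe, sketched without the intermediate `profi.txt` stats file:

```python
import cProfile
import io
import pstats

def work():
    # hypothetical workload standing in for run_commands()
    return sum(i * i for i in range(1000))

pr = cProfile.Profile()
pr.enable()
work()
pr.disable()

# Sort by internal time and print the top entries, as the diff does
# with p.sort_stats('time').print_stats(75).
buf = io.StringIO()
pstats.Stats(pr, stream=buf).sort_stats('time').print_stats(5)
assert 'work' in buf.getvalue()
```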
127 | 133 | def set_main_module(file_path): |
128 | 134 | Context.g_module=Context.load_module(file_path) |
129 | 135 | Context.g_module.root_path=file_path |
131 | 137 | name=obj.__name__ |
132 | 138 | if not name in Context.g_module.__dict__: |
133 | 139 | setattr(Context.g_module,name,obj) |
134 | for k in(update,dist,distclean,distcheck): | |
140 | for k in(dist,distclean,distcheck): | |
135 | 141 | set_def(k) |
136 | 142 | if not'init'in Context.g_module.__dict__: |
137 | 143 | Context.g_module.init=Utils.nada |
172 | 178 | while Options.commands: |
173 | 179 | cmd_name=Options.commands.pop(0) |
174 | 180 | ctx=run_command(cmd_name) |
175 | Logs.info('%r finished successfully (%s)'%(cmd_name,str(ctx.log_timer))) | |
181 | Logs.info('%r finished successfully (%s)',cmd_name,ctx.log_timer) | |
176 | 182 | run_command('shutdown') |
177 | def _can_distclean(name): | |
178 | for k in'.o .moc .exe'.split(): | |
179 | if name.endswith(k): | |
180 | return True | |
181 | return False | |
182 | 183 | def distclean_dir(dirname): |
183 | 184 | for(root,dirs,files)in os.walk(dirname): |
184 | 185 | for f in files: |
185 | if _can_distclean(f): | |
186 | if f.endswith(('.o','.moc','.exe')): | |
186 | 187 | fname=os.path.join(root,f) |
187 | 188 | try: |
188 | 189 | os.remove(fname) |
189 | 190 | except OSError: |
190 | Logs.warn('Could not remove %r'%fname) | |
191 | Logs.warn('Could not remove %r',fname) | |
191 | 192 | for x in(Context.DBFILE,'config.log'): |
192 | 193 | try: |
193 | 194 | os.remove(x) |
205 | 206 | try: |
206 | 207 | proj=ConfigSet.ConfigSet(f) |
207 | 208 | except IOError: |
208 | Logs.warn('Could not read %r'%f) | |
209 | Logs.warn('Could not read %r',f) | |
209 | 210 | continue |
210 | 211 | if proj['out_dir']!=proj['top_dir']: |
211 | 212 | try: |
212 | 213 | shutil.rmtree(proj['out_dir']) |
213 | except IOError: | |
214 | pass | |
215 | except OSError as e: | |
214 | except EnvironmentError as e: | |
216 | 215 | if e.errno!=errno.ENOENT: |
217 | Logs.warn('Could not remove %r'%proj['out_dir']) | |
216 | Logs.warn('Could not remove %r',proj['out_dir']) | |
218 | 217 | else: |
219 | 218 | distclean_dir(proj['out_dir']) |
220 | 219 | for k in(proj['out_dir'],proj['top_dir'],proj['run_dir']): |
223 | 222 | os.remove(p) |
224 | 223 | except OSError as e: |
225 | 224 | if e.errno!=errno.ENOENT: |
226 | Logs.warn('Could not remove %r'%p) | |
225 | Logs.warn('Could not remove %r',p) | |
227 | 226 | if not Options.commands: |
228 | 227 | for x in'.waf-1. waf-1. .waf3-1. waf3-1.'.split(): |
229 | 228 | if f.startswith(x): |
251 | 250 | pass |
252 | 251 | files=self.get_files() |
253 | 252 | if self.algo.startswith('tar.'): |
254 | tar=tarfile.open(arch_name,'w:'+self.algo.replace('tar.','')) | |
253 | tar=tarfile.open(node.abspath(),'w:'+self.algo.replace('tar.','')) | |
255 | 254 | for x in files: |
256 | 255 | self.add_tar_file(x,tar) |
257 | 256 | tar.close() |
258 | 257 | elif self.algo=='zip': |
259 | 258 | import zipfile |
260 | zip=zipfile.ZipFile(arch_name,'w',compression=zipfile.ZIP_DEFLATED) | |
259 | zip=zipfile.ZipFile(node.abspath(),'w',compression=zipfile.ZIP_DEFLATED) | |
261 | 260 | for x in files: |
262 | 261 | archive_name=self.get_base_name()+'/'+x.path_from(self.base_path) |
263 | 262 | zip.write(x.abspath(),archive_name,zipfile.ZIP_DEFLATED) |
265 | 264 | else: |
266 | 265 | self.fatal('Valid algo types are tar.bz2, tar.gz, tar.xz or zip') |
267 | 266 | try: |
268 | from hashlib import sha1 as sha | |
267 | from hashlib import sha1 | |
269 | 268 | except ImportError: |
270 | from sha import sha | |
271 | try: | |
272 | digest=" (sha=%r)"%sha(node.read()).hexdigest() | |
273 | except Exception: | |
274 | 269 | digest='' |
275 | Logs.info('New archive created: %s%s'%(self.arch_name,digest)) | |
270 | else: | |
271 | digest=' (sha=%r)'%sha1(node.read(flags='rb')).hexdigest() | |
272 | Logs.info('New archive created: %s%s',self.arch_name,digest) | |
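The digest line now reads the archive in binary mode and hashes it with `hashlib.sha1`; the legacy `sha` module fallback is dropped. A minimal sketch:

```python
import hashlib

# Hashing arbitrary archive bytes yields a 40-character hex digest.
digest = hashlib.sha1(b'tarball bytes').hexdigest()
assert len(digest) == 40
```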
276 | 273 | def get_tar_path(self,node): |
277 | 274 | return node.abspath() |
278 | 275 | def add_tar_file(self,x,tar): |
282 | 279 | tinfo.gid=0 |
283 | 280 | tinfo.uname='root' |
284 | 281 | tinfo.gname='root' |
285 | fu=None | |
286 | try: | |
282 | if os.path.isfile(p): | |
287 | 283 | fu=open(p,'rb') |
288 | tar.addfile(tinfo,fileobj=fu) | |
289 | finally: | |
290 | if fu: | |
284 | try: | |
285 | tar.addfile(tinfo,fileobj=fu) | |
286 | finally: | |
291 | 287 | fu.close() |
288 | else: | |
289 | tar.addfile(tinfo) | |
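The reworked `add_tar_file` only passes a `fileobj` for regular files and writes a bare header otherwise (directories, symlinks). A small sketch of both paths with the stdlib `tarfile` API, built entirely in memory:

```python
import io
import tarfile

buf = io.BytesIO()
with tarfile.open(fileobj=buf, mode='w:gz') as tar:
    data = b'hello'
    ti = tarfile.TarInfo('hello.txt')
    ti.size = len(data)
    ti.uid = ti.gid = 0            # normalized ownership, as in the diff
    ti.uname = ti.gname = 'root'
    tar.addfile(ti, fileobj=io.BytesIO(data))  # regular file: pass a fileobj

    d = tarfile.TarInfo('sub')
    d.type = tarfile.DIRTYPE
    tar.addfile(d)                 # non-file entry: header only

buf.seek(0)
with tarfile.open(fileobj=buf, mode='r:gz') as tar:
    assert tar.extractfile('hello.txt').read() == b'hello'
    assert tar.getmember('sub').isdir()
```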
292 | 290 | def get_tar_prefix(self): |
293 | 291 | try: |
294 | 292 | return self.tar_prefix |
336 | 334 | self.check() |
337 | 335 | def check(self): |
338 | 336 | import tempfile,tarfile |
339 | t=None | |
340 | 337 | try: |
341 | 338 | t=tarfile.open(self.get_arch_name()) |
342 | 339 | for x in t: |
343 | 340 | t.extract(x) |
344 | 341 | finally: |
345 | if t: | |
346 | t.close() | |
342 | t.close() | |
347 | 343 | cfg=[] |
348 | 344 | if Options.options.distcheck_args: |
349 | 345 | cfg=shlex.split(Options.options.distcheck_args) |
352 | 348 | instdir=tempfile.mkdtemp('.inst',self.get_base_name()) |
353 | 349 | ret=Utils.subprocess.Popen([sys.executable,sys.argv[0],'configure','install','uninstall','--destdir='+instdir]+cfg,cwd=self.get_base_name()).wait() |
354 | 350 | if ret: |
355 | raise Errors.WafError('distcheck failed with code %i'%ret) | |
351 | raise Errors.WafError('distcheck failed with code %r'%ret) | |
356 | 352 | if os.path.exists(instdir): |
357 | 353 | raise Errors.WafError('distcheck succeeded, but files were left in %s'%instdir) |
358 | 354 | shutil.rmtree(self.get_base_name()) |
359 | 355 | def distcheck(ctx): |
360 | 356 | '''checks if the project compiles (tarball from 'dist')''' |
361 | 357 | pass |
362 | def update(ctx): | |
363 | lst=Options.options.files | |
364 | if lst: | |
365 | lst=lst.split(',') | |
366 | else: | |
367 | path=os.path.join(Context.waf_dir,'waflib','extras') | |
368 | lst=[x for x in Utils.listdir(path)if x.endswith('.py')] | |
369 | for x in lst: | |
370 | tool=x.replace('.py','') | |
371 | if not tool: | |
372 | continue | |
373 | try: | |
374 | dl=Configure.download_tool | |
375 | except AttributeError: | |
376 | ctx.fatal('The command "update" is dangerous; include the tool "use_config" in your project!') | |
377 | try: | |
378 | dl(tool,force=True,ctx=ctx) | |
379 | except Errors.WafError: | |
380 | Logs.error('Could not find the tool %r in the remote repository'%x) | |
381 | else: | |
382 | Logs.warn('Updated %r'%tool) | |
383 | 358 | def autoconfigure(execute_method): |
384 | 359 | def execute(self): |
385 | 360 | if not Configure.autoconfig: |
388 | 363 | do_config=False |
389 | 364 | try: |
390 | 365 | env.load(os.path.join(Context.top_dir,Options.lockfile)) |
391 | except Exception: | |
366 | except EnvironmentError: | |
392 | 367 | Logs.warn('Configuring the project') |
393 | 368 | do_config=True |
394 | 369 | else: |
396 | 371 | do_config=True |
397 | 372 | else: |
398 | 373 | h=0 |
399 | for f in env['files']: | |
400 | h=Utils.h_list((h,Utils.readf(f,'rb'))) | |
401 | do_config=h!=env.hash | |
374 | for f in env.files: | |
375 | try: | |
376 | h=Utils.h_list((h,Utils.readf(f,'rb'))) | |
377 | except EnvironmentError: | |
378 | do_config=True | |
379 | break | |
380 | else: | |
381 | do_config=h!=env.hash | |
402 | 382 | if do_config: |
403 | cmd=env['config_cmd']or'configure' | |
383 | cmd=env.config_cmd or'configure' | |
404 | 384 | if Configure.autoconfig=='clobber': |
405 | 385 | tmp=Options.options.__dict__ |
406 | 386 | Options.options.__dict__=env.options |
1 | 1 | # encoding: utf-8 |
2 | 2 | # WARNING! Do not edit! https://waf.io/book/index.html#_obtaining_the_waf_file |
3 | 3 | |
4 | import os,re,sys | |
4 | import os,re,sys,tempfile | |
5 | 5 | from waflib import Utils,Logs,Errors |
6 | 6 | NOT_RUN=0 |
7 | 7 | MISSING=1 |
17 | 17 | env = tsk.env |
18 | 18 | gen = tsk.generator |
19 | 19 | bld = gen.bld |
20 | cwdx = getattr(bld, 'cwdx', bld.bldnode) # TODO single cwd value in waf 1.9 | |
21 | wd = getattr(tsk, 'cwd', None) | |
20 | cwdx = tsk.get_cwd() | |
22 | 21 | p = env.get_flat |
23 | 22 | tsk.last_cmd = cmd = \'\'\' %s \'\'\' % s |
24 | return tsk.exec_command(cmd, cwd=wd, env=env.env or None) | |
23 | return tsk.exec_command(cmd, cwd=cwdx, env=env.env or None) | |
25 | 24 | ''' |
26 | 25 | COMPILE_TEMPLATE_NOSHELL=''' |
27 | 26 | def f(tsk): |
28 | 27 | env = tsk.env |
29 | 28 | gen = tsk.generator |
30 | 29 | bld = gen.bld |
31 | cwdx = getattr(bld, 'cwdx', bld.bldnode) # TODO single cwd value in waf 1.9 | |
32 | wd = getattr(tsk, 'cwd', None) | |
30 | cwdx = tsk.get_cwd() | |
33 | 31 | def to_list(xx): |
34 | 32 | if isinstance(xx, str): return [xx] |
35 | 33 | return xx |
36 | tsk.last_cmd = lst = [] | |
34 | def merge(lst1, lst2): | |
35 | if lst1 and lst2: | |
36 | return lst1[:-1] + [lst1[-1] + lst2[0]] + lst2[1:] | |
37 | return lst1 + lst2 | |
38 | lst = [] | |
37 | 39 | %s |
38 | lst = [x for x in lst if x] | |
39 | return tsk.exec_command(lst, cwd=wd, env=env.env or None) | |
40 | if '' in lst: | |
41 | lst = [x for x in lst if x] | |
42 | tsk.last_cmd = lst | |
43 | return tsk.exec_command(lst, cwd=cwdx, env=env.env or None) | |
40 | 44 | ''' |
41 | 45 | classes={} |
42 | 46 | class store_task_type(type): |
43 | 47 | def __init__(cls,name,bases,dict): |
44 | 48 | super(store_task_type,cls).__init__(name,bases,dict) |
45 | 49 | name=cls.__name__ |
46 | if name.endswith('_task'): | |
47 | name=name.replace('_task','') | |
48 | 50 | if name!='evil'and name!='TaskBase': |
49 | 51 | global classes |
50 | 52 | if getattr(cls,'run_str',None): |
66 | 68 | before=[] |
67 | 69 | after=[] |
68 | 70 | hcode='' |
71 | keep_last_cmd=False | |
72 | __slots__=('hasrun','generator') | |
69 | 73 | def __init__(self,*k,**kw): |
70 | 74 | self.hasrun=NOT_RUN |
71 | 75 | try: |
78 | 82 | if hasattr(self,'fun'): |
79 | 83 | return self.fun.__name__ |
80 | 84 | return self.__class__.__name__ |
81 | def __hash__(self): | |
82 | return id(self) | |
83 | 85 | def keyword(self): |
84 | 86 | if hasattr(self,'fun'): |
85 | 87 | return'Function' |
86 | 88 | return'Processing' |
89 | def get_cwd(self): | |
90 | bld=self.generator.bld | |
91 | ret=getattr(self,'cwd',None)or getattr(bld,'cwd',bld.bldnode) | |
92 | if isinstance(ret,str): | |
93 | if os.path.isabs(ret): | |
94 | ret=bld.root.make_node(ret) | |
95 | else: | |
96 | ret=self.generator.path.make_node(ret) | |
97 | return ret | |
98 | def quote_flag(self,x): | |
99 | old=x | |
100 | if'\\'in x: | |
101 | x=x.replace('\\','\\\\') | |
102 | if'"'in x: | |
103 | x=x.replace('"','\\"') | |
104 | if old!=x or' 'in x or'\t'in x or"'"in x: | |
105 | x='"%s"'%x | |
106 | return x | |
107 | def split_argfile(self,cmd): | |
108 | return([cmd[0]],[self.quote_flag(x)for x in cmd[1:]]) | |
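The new `quote_flag` escapes backslashes and double quotes, then wraps the argument in quotes if anything was escaped or it contains whitespace. A stand-alone copy of that logic:

```python
def quote_flag(x):
    # Mirrors Task.quote_flag from the hunk above: escape, then quote
    # only when escaping or whitespace makes it necessary.
    old = x
    if '\\' in x:
        x = x.replace('\\', '\\\\')
    if '"' in x:
        x = x.replace('"', '\\"')
    if old != x or ' ' in x or '\t' in x or "'" in x:
        x = '"%s"' % x
    return x

assert quote_flag('plain') == 'plain'
assert quote_flag('has space') == '"has space"'
assert quote_flag('a"b') == '"a\\"b"'
```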
87 | 109 | def exec_command(self,cmd,**kw): |
88 | bld=self.generator.bld | |
89 | try: | |
90 | if not kw.get('cwd',None): | |
91 | kw['cwd']=bld.cwd | |
92 | except AttributeError: | |
93 | bld.cwd=kw['cwd']=bld.variant_dir | |
94 | return bld.exec_command(cmd,**kw) | |
110 | if not'cwd'in kw: | |
111 | kw['cwd']=self.get_cwd() | |
112 | if hasattr(self,'timeout'): | |
113 | kw['timeout']=self.timeout | |
114 | if self.env.PATH: | |
115 | env=kw['env']=dict(kw.get('env')or self.env.env or os.environ) | |
116 | env['PATH']=self.env.PATH if isinstance(self.env.PATH,str)else os.pathsep.join(self.env.PATH) | |
117 | if not isinstance(cmd,str)and(len(repr(cmd))>=8192 if Utils.is_win32 else len(cmd)>200000): | |
118 | cmd,args=self.split_argfile(cmd) | |
119 | try: | |
120 | (fd,tmp)=tempfile.mkstemp() | |
121 | os.write(fd,'\r\n'.join(args).encode()) | |
122 | os.close(fd) | |
123 | if Logs.verbose: | |
124 | Logs.debug('argfile: @%r -> %r',tmp,args) | |
125 | return self.generator.bld.exec_command(cmd+['@'+tmp],**kw) | |
126 | finally: | |
127 | try: | |
128 | os.remove(tmp) | |
129 | except OSError: | |
130 | pass | |
131 | else: | |
132 | return self.generator.bld.exec_command(cmd,**kw) | |
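When the command line grows too long (the diff checks roughly 8 KiB of `repr` on Windows, 200 000 characters elsewhere), `exec_command` now spills the arguments to a temporary file and passes `@file` instead; the invoked tool is assumed to understand @-response-files. A hedged sketch of that mechanism with a simplified length check:

```python
import os
import tempfile

def with_argfile(cmd, limit=200000):
    # Return (possibly rewritten command, temp path or None). The caller
    # is responsible for deleting the temp file, as the finally block
    # in the diff does.
    if len(' '.join(cmd)) <= limit:
        return cmd, None
    fd, tmp = tempfile.mkstemp()
    os.write(fd, '\r\n'.join(cmd[1:]).encode())  # one argument per line
    os.close(fd)
    return [cmd[0], '@' + tmp], tmp

cmd, tmp = with_argfile(['link.exe'] + ['obj%d.o' % i for i in range(3)], limit=10)
assert cmd[0] == 'link.exe' and cmd[1].startswith('@')
os.remove(tmp)
```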
95 | 133 | def runnable_status(self): |
96 | 134 | return RUN_ME |
135 | def uid(self): | |
136 | return Utils.SIG_NIL | |
97 | 137 | def process(self): |
98 | m=self.master | |
99 | if m.stop: | |
100 | m.out.put(self) | |
101 | return | |
138 | m=self.generator.bld.producer | |
102 | 139 | try: |
103 | 140 | del self.generator.bld.task_sigs[self.uid()] |
104 | 141 | except KeyError: |
105 | 142 | pass |
106 | 143 | try: |
107 | self.generator.bld.returned_tasks.append(self) | |
108 | self.log_display(self.generator.bld) | |
109 | 144 | ret=self.run() |
110 | 145 | except Exception: |
111 | 146 | self.err_msg=Utils.ex_stack() |
112 | 147 | self.hasrun=EXCEPTION |
113 | 148 | m.error_handler(self) |
114 | m.out.put(self) | |
115 | 149 | return |
116 | 150 | if ret: |
117 | 151 | self.err_code=ret |
128 | 162 | self.hasrun=SUCCESS |
129 | 163 | if self.hasrun!=SUCCESS: |
130 | 164 | m.error_handler(self) |
131 | m.out.put(self) | |
132 | 165 | def run(self): |
133 | 166 | if hasattr(self,'fun'): |
134 | 167 | return self.fun(self) |
153 | 186 | def display(self): |
154 | 187 | col1=Logs.colors(self.color) |
155 | 188 | col2=Logs.colors.NORMAL |
156 | master=self.master | |
189 | master=self.generator.bld.producer | |
157 | 190 | def cur(): |
158 | 191 | tmp=-1 |
159 | 192 | if hasattr(master,'ready'): |
182 | 215 | if kw: |
183 | 216 | kw+=' ' |
184 | 217 | return fs%(cur(),total,kw,col1,s,col2) |
185 | def attr(self,att,default=None): | |
186 | ret=getattr(self,att,self) | |
187 | if ret is self:return getattr(self.__class__,att,default) | |
188 | return ret | |
189 | 218 | def hash_constraints(self): |
190 | 219 | cls=self.__class__ |
191 | 220 | tup=(str(cls.before),str(cls.after),str(cls.ext_in),str(cls.ext_out),cls.__name__,cls.hcode) |
192 | h=hash(tup) | |
193 | return h | |
221 | return hash(tup) | |
194 | 222 | def format_error(self): |
195 | msg=getattr(self,'last_cmd','') | |
223 | if Logs.verbose: | |
224 | msg=': %r\n%r'%(self,getattr(self,'last_cmd','')) | |
225 | else: | |
226 | msg=' (run with -v to display more information)' | |
196 | 227 | name=getattr(self.generator,'name','') |
197 | 228 | if getattr(self,"err_msg",None): |
198 | 229 | return self.err_msg |
200 | 231 | return'task in %r was not executed for some reason: %r'%(name,self) |
201 | 232 | elif self.hasrun==CRASHED: |
202 | 233 | try: |
203 | return' -> task in %r failed (exit status %r): %r\n%r'%(name,self.err_code,self,msg) | |
234 | return' -> task in %r failed with exit status %r%s'%(name,self.err_code,msg) | |
204 | 235 | except AttributeError: |
205 | return' -> task in %r failed: %r\n%r'%(name,self,msg) | |
236 | return' -> task in %r failed%s'%(name,msg) | |
206 | 237 | elif self.hasrun==MISSING: |
207 | return' -> missing files in %r: %r\n%r'%(name,self,msg) | |
238 | return' -> missing files in %r%s'%(name,msg) | |
208 | 239 | else: |
209 | 240 | return'invalid status for task in %r: %r'%(name,self.hasrun) |
210 | 241 | def colon(self,var1,var2): |
225 | 256 | return lst |
226 | 257 | class Task(TaskBase): |
227 | 258 | vars=[] |
259 | always_run=False | |
228 | 260 | shell=False |
229 | 261 | def __init__(self,*k,**kw): |
230 | 262 | TaskBase.__init__(self,*k,**kw) |
232 | 264 | self.inputs=[] |
233 | 265 | self.outputs=[] |
234 | 266 | self.dep_nodes=[] |
235 | self.run_after=set([]) | |
267 | self.run_after=set() | |
236 | 268 | def __str__(self): |
237 | 269 | name=self.__class__.__name__ |
238 | 270 | if self.outputs: |
239 | if(name.endswith('lib')or name.endswith('program'))or not self.inputs: | |
271 | if name.endswith(('lib','program'))or not self.inputs: | |
240 | 272 | node=self.outputs[0] |
241 | 273 | return node.path_from(node.ctx.launch_node()) |
242 | 274 | if not(self.inputs or self.outputs): |
248 | 280 | tgt_str=' '.join([a.path_from(a.ctx.launch_node())for a in self.outputs]) |
249 | 281 | if self.outputs:sep=' -> ' |
250 | 282 | else:sep='' |
251 | return'%s: %s%s%s'%(self.__class__.__name__.replace('_task',''),src_str,sep,tgt_str) | |
283 | return'%s: %s%s%s'%(self.__class__.__name__,src_str,sep,tgt_str) | |
252 | 284 | def keyword(self): |
253 | 285 | name=self.__class__.__name__ |
254 | if name.endswith('lib')or name.endswith('program'): | |
286 | if name.endswith(('lib','program')): | |
255 | 287 | return'Linking' |
256 | 288 | if len(self.inputs)==1 and len(self.outputs)==1: |
257 | 289 | return'Compiling' |
273 | 305 | try: |
274 | 306 | return self.uid_ |
275 | 307 | except AttributeError: |
276 | m=Utils.md5() | |
308 | m=Utils.md5(self.__class__.__name__) | |
277 | 309 | up=m.update |
278 | up(self.__class__.__name__) | |
279 | 310 | for x in self.inputs+self.outputs: |
280 | 311 | up(x.abspath()) |
281 | 312 | self.uid_=m.digest() |
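The `uid()` change above folds the class name into the `md5()` constructor call instead of a separate `update()`. The resulting identity is the same either way: a hash of the task class name plus all input and output paths. A standalone sketch of that identity hash (the helper name is invented, not the waf API; the encoding mirrors the Python 3 variant further down):

```python
from hashlib import md5

def task_uid(cls_name, input_paths, output_paths):
    # identity = class name + every input/output path, as in uid() above
    m = md5(cls_name.encode('iso8859-1', 'xmlcharrefreplace'))
    for p in input_paths + output_paths:
        m.update(p.encode('iso8859-1', 'xmlcharrefreplace'))
    return m.digest()
```

Two tasks of the same class on the same nodes share a uid; changing either the class or a node path changes it.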
290 | 321 | assert isinstance(task,TaskBase) |
291 | 322 | self.run_after.add(task) |
292 | 323 | def signature(self): |
293 | try:return self.cache_sig | |
294 | except AttributeError:pass | |
295 | self.m=Utils.md5() | |
296 | self.m.update(self.hcode) | |
324 | try: | |
325 | return self.cache_sig | |
326 | except AttributeError: | |
327 | pass | |
328 | self.m=Utils.md5(self.hcode) | |
297 | 329 | self.sig_explicit_deps() |
298 | 330 | self.sig_vars() |
299 | 331 | if self.scan: |
307 | 339 | for t in self.run_after: |
308 | 340 | if not t.hasrun: |
309 | 341 | return ASK_LATER |
310 | bld=self.generator.bld | |
311 | 342 | try: |
312 | 343 | new_sig=self.signature() |
313 | 344 | except Errors.TaskNotReady: |
314 | 345 | return ASK_LATER |
346 | bld=self.generator.bld | |
315 | 347 | key=self.uid() |
316 | 348 | try: |
317 | 349 | prev_sig=bld.task_sigs[key] |
318 | 350 | except KeyError: |
319 | Logs.debug("task: task %r must run as it was never run before or the task code changed"%self) | |
351 | Logs.debug('task: task %r must run: it was never run before or the task code changed',self) | |
352 | return RUN_ME | |
353 | if new_sig!=prev_sig: | |
354 | Logs.debug('task: task %r must run: the task signature changed',self) | |
320 | 355 | return RUN_ME |
321 | 356 | for node in self.outputs: |
322 | try: | |
323 | if node.sig!=new_sig: | |
324 | return RUN_ME | |
325 | except AttributeError: | |
326 | Logs.debug("task: task %r must run as the output nodes do not exist"%self) | |
357 | sig=bld.node_sigs.get(node) | |
358 | if not sig: | |
359 | Logs.debug('task: task %r must run: an output node has no signature',self) | |
327 | 360 | return RUN_ME |
328 | if new_sig!=prev_sig: | |
329 | return RUN_ME | |
330 | return SKIP_ME | |
361 | if sig!=key: | |
362 | Logs.debug('task: task %r must run: an output node was produced by another task',self) | |
363 | return RUN_ME | |
364 | if not node.exists(): | |
365 | Logs.debug('task: task %r must run: an output node does not exist',self) | |
366 | return RUN_ME | |
367 | return(self.always_run and RUN_ME)or SKIP_ME | |
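The reworked decision above replaces per-node `node.sig` comparisons with a `bld.node_sigs` lookup: a task must run when its signature changed, when an output node has no recorded signature, when the node was produced by a *different* task (`sig != key`), or when the file is missing; the new `always_run` attribute then forces `RUN_ME` even on a clean skip. That logic can be condensed with plain dicts standing in for `bld.task_sigs`/`bld.node_sigs` (constants and helper are illustrative, not waf's):

```python
ASK_LATER, SKIP_ME, RUN_ME = -1, -2, -3  # stand-ins for waf's status codes

def runnable_status(key, new_sig, task_sigs, node_sigs, outputs, always_run=False):
    # outputs: list of (node, exists) pairs in this simplified model
    if task_sigs.get(key) != new_sig:
        return RUN_ME                      # never run, or signature changed
    for node, exists in outputs:
        sig = node_sigs.get(node)
        if not sig:
            return RUN_ME                  # output node has no signature
        if sig != key:
            return RUN_ME                  # produced by another task
        if not exists:
            return RUN_ME                  # output file does not exist
    return (always_run and RUN_ME) or SKIP_ME
```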
331 | 368 | def post_run(self): |
332 | 369 | bld=self.generator.bld |
333 | sig=self.signature() | |
334 | 370 | for node in self.outputs: |
335 | try: | |
336 | os.stat(node.abspath()) | |
337 | except OSError: | |
371 | if not node.exists(): | |
338 | 372 | self.hasrun=MISSING |
339 | 373 | self.err_msg='-> missing file: %r'%node.abspath() |
340 | 374 | raise Errors.WafError(self.err_msg) |
341 | node.sig=node.cache_sig=sig | |
342 | bld.task_sigs[self.uid()]=self.cache_sig | |
375 | bld.node_sigs[node]=self.uid() | |
376 | bld.task_sigs[self.uid()]=self.signature() | |
377 | if not self.keep_last_cmd: | |
378 | try: | |
379 | del self.last_cmd | |
380 | except AttributeError: | |
381 | pass | |
343 | 382 | def sig_explicit_deps(self): |
344 | 383 | bld=self.generator.bld |
345 | 384 | upd=self.m.update |
346 | 385 | for x in self.inputs+self.dep_nodes: |
347 | try: | |
348 | upd(x.get_bld_sig()) | |
349 | except(AttributeError,TypeError): | |
350 | raise Errors.WafError('Missing node signature for %r (required by %r)'%(x,self)) | |
386 | upd(x.get_bld_sig()) | |
351 | 387 | if bld.deps_man: |
352 | 388 | additional_deps=bld.deps_man |
353 | 389 | for x in self.inputs+self.outputs: |
354 | 390 | try: |
355 | d=additional_deps[id(x)] | |
391 | d=additional_deps[x] | |
356 | 392 | except KeyError: |
357 | 393 | continue |
358 | 394 | for v in d: |
359 | 395 | if isinstance(v,bld.root.__class__): |
360 | try: | |
361 | v=v.get_bld_sig() | |
362 | except AttributeError: | |
363 | raise Errors.WafError('Missing node signature for %r (required by %r)'%(v,self)) | |
396 | v=v.get_bld_sig() | |
364 | 397 | elif hasattr(v,'__call__'): |
365 | 398 | v=v() |
366 | 399 | upd(v) |
367 | return self.m.digest() | |
368 | 400 | def sig_vars(self): |
369 | bld=self.generator.bld | |
370 | env=self.env | |
371 | upd=self.m.update | |
372 | act_sig=bld.hash_env_vars(env,self.__class__.vars) | |
373 | upd(act_sig) | |
374 | dep_vars=getattr(self,'dep_vars',None) | |
375 | if dep_vars: | |
376 | upd(bld.hash_env_vars(env,dep_vars)) | |
377 | return self.m.digest() | |
401 | sig=self.generator.bld.hash_env_vars(self.env,self.__class__.vars) | |
402 | self.m.update(sig) | |
378 | 403 | scan=None |
379 | 404 | def sig_implicit_deps(self): |
380 | 405 | bld=self.generator.bld |
381 | 406 | key=self.uid() |
382 | prev=bld.task_sigs.get((key,'imp'),[]) | |
407 | prev=bld.imp_sigs.get(key,[]) | |
383 | 408 | if prev: |
384 | 409 | try: |
385 | 410 | if prev==self.compute_sig_implicit_deps(): |
388 | 413 | raise |
389 | 414 | except EnvironmentError: |
390 | 415 | for x in bld.node_deps.get(self.uid(),[]): |
391 | if not x.is_bld(): | |
416 | if not x.is_bld()and not x.exists(): | |
392 | 417 | try: |
393 | os.stat(x.abspath()) | |
394 | except OSError: | |
395 | try: | |
396 | del x.parent.children[x.name] | |
397 | except KeyError: | |
398 | pass | |
399 | del bld.task_sigs[(key,'imp')] | |
418 | del x.parent.children[x.name] | |
419 | except KeyError: | |
420 | pass | |
421 | del bld.imp_sigs[key] | |
400 | 422 | raise Errors.TaskRescan('rescan') |
401 | (nodes,names)=self.scan() | |
423 | (bld.node_deps[key],bld.raw_deps[key])=self.scan() | |
402 | 424 | if Logs.verbose: |
403 | Logs.debug('deps: scanner for %s returned %s %s'%(str(self),str(nodes),str(names))) | |
404 | bld.node_deps[key]=nodes | |
405 | bld.raw_deps[key]=names | |
406 | self.are_implicit_nodes_ready() | |
407 | try: | |
408 | bld.task_sigs[(key,'imp')]=sig=self.compute_sig_implicit_deps() | |
409 | except Exception: | |
410 | if Logs.verbose: | |
411 | for k in bld.node_deps.get(self.uid(),[]): | |
412 | try: | |
413 | k.get_bld_sig() | |
414 | except Exception: | |
415 | Logs.warn('Missing signature for node %r (may cause rebuilds)'%k) | |
416 | else: | |
417 | return sig | |
425 | Logs.debug('deps: scanner for %s: %r; unresolved: %r',self,bld.node_deps[key],bld.raw_deps[key]) | |
426 | try: | |
427 | bld.imp_sigs[key]=self.compute_sig_implicit_deps() | |
428 | except EnvironmentError: | |
429 | for k in bld.node_deps.get(self.uid(),[]): | |
430 | if not k.exists(): | |
431 | Logs.warn('Dependency %r for %r is missing: check the task declaration and the build order!',k,self) | |
432 | raise | |
418 | 433 | def compute_sig_implicit_deps(self): |
419 | 434 | upd=self.m.update |
420 | bld=self.generator.bld | |
421 | 435 | self.are_implicit_nodes_ready() |
422 | for k in bld.node_deps.get(self.uid(),[]): | |
436 | for k in self.generator.bld.node_deps.get(self.uid(),[]): | |
423 | 437 | upd(k.get_bld_sig()) |
424 | 438 | return self.m.digest() |
425 | 439 | def are_implicit_nodes_ready(self): |
449 | 463 | try: |
450 | 464 | return self.uid_ |
451 | 465 | except AttributeError: |
452 | m=Utils.md5() | |
466 | m=Utils.md5(self.__class__.__name__.encode('iso8859-1','xmlcharrefreplace')) | |
453 | 467 | up=m.update |
454 | up(self.__class__.__name__.encode('iso8859-1','xmlcharrefreplace')) | |
455 | 468 | for x in self.inputs+self.outputs: |
456 | 469 | up(x.abspath().encode('iso8859-1','xmlcharrefreplace')) |
457 | 470 | self.uid_=m.digest() |
506 | 519 | dc={} |
507 | 520 | exec(c,dc) |
508 | 521 | return dc['f'] |
509 | re_novar=re.compile(r"^(SRC|TGT)\W+.*?$") | |
510 | reg_act=re.compile(r"(?P<backslash>\\)|(?P<dollar>\$\$)|(?P<subst>\$\{(?P<var>\w+)(?P<code>.*?)\})",re.M) | |
522 | re_cond=re.compile('(?P<var>\w+)|(?P<or>\|)|(?P<and>&)') | |
523 | re_novar=re.compile(r'^(SRC|TGT)\W+.*?$') | |
524 | reg_act=re.compile(r'(?P<backslash>\\)|(?P<dollar>\$\$)|(?P<subst>\$\{(?P<var>\w+)(?P<code>.*?)\})',re.M) | |
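These three patterns drive the rule-string compilers below: `reg_act` splits a command line into literal text, escaped `$$`, and `${VAR...}` substitutions, while the new `re_cond` supports the `${VAR?COND}` conditional form. A standalone probe of `reg_act` (the driver function is invented for illustration):

```python
import re

# copied from the hunk above
reg_act = re.compile(r'(?P<backslash>\\)|(?P<dollar>\$\$)|'
                     r'(?P<subst>\$\{(?P<var>\w+)(?P<code>.*?)\})', re.M)

def extract_vars(line):
    # names of the variables referenced by a rule string such as a
    # compiler command line; trailing code like '[0].abspath()' is
    # captured separately in the 'code' group
    return [m.group('var') for m in reg_act.finditer(line) if m.group('subst')]
```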
511 | 525 | def compile_fun_shell(line): |
512 | 526 | extr=[] |
513 | 527 | def repl(match): |
514 | 528 | g=match.group |
515 | if g('dollar'):return"$" | |
516 | elif g('backslash'):return'\\\\' | |
517 | elif g('subst'):extr.append((g('var'),g('code')));return"%s" | |
529 | if g('dollar'): | |
530 | return"$" | |
531 | elif g('backslash'): | |
532 | return'\\\\' | |
533 | elif g('subst'): | |
534 | extr.append((g('var'),g('code'))) | |
535 | return"%s" | |
518 | 536 | return None |
519 | 537 | line=reg_act.sub(repl,line)or line |
538 | def replc(m): | |
539 | if m.group('and'): | |
540 | return' and ' | |
541 | elif m.group('or'): | |
542 | return' or ' | |
543 | else: | |
544 | x=m.group('var') | |
545 | if x not in dvars: | |
546 | dvars.append(x) | |
547 | return'env[%r]'%x | |
520 | 548 | parm=[] |
521 | 549 | dvars=[] |
522 | 550 | app=parm.append |
544 | 572 | dvars.append(meth[1:]) |
545 | 573 | m='%r'%m |
546 | 574 | app('" ".join(tsk.colon(%r, %s))'%(var,m)) |
575 | elif meth.startswith('?'): | |
576 | expr=re_cond.sub(replc,meth[1:]) | |
577 | app('p(%r) if (%s) else ""'%(var,expr)) | |
547 | 578 | else: |
548 | 579 | app('%s%s'%(var,meth)) |
549 | 580 | else: |
553 | 584 | if parm:parm="%% (%s) "%(',\n\t\t'.join(parm)) |
554 | 585 | else:parm='' |
555 | 586 | c=COMPILE_TEMPLATE_SHELL%(line,parm) |
556 | Logs.debug('action: %s'%c.strip().splitlines()) | |
587 | Logs.debug('action: %s',c.strip().splitlines()) | |
557 | 588 | return(funex(c),dvars) |
589 | reg_act_noshell=re.compile(r"(?P<space>\s+)|(?P<subst>\$\{(?P<var>\w+)(?P<code>.*?)\})|(?P<text>([^$ \t\n\r\f\v]|\$\$)+)",re.M) | |
558 | 590 | def compile_fun_noshell(line): |
559 | extr=[] | |
560 | def repl(match): | |
561 | g=match.group | |
562 | if g('dollar'):return"$" | |
563 | elif g('backslash'):return'\\' | |
564 | elif g('subst'):extr.append((g('var'),g('code')));return"<<|@|>>" | |
565 | return None | |
566 | line2=reg_act.sub(repl,line) | |
567 | params=line2.split('<<|@|>>') | |
568 | assert(extr) | |
569 | 591 | buf=[] |
570 | 592 | dvars=[] |
593 | merge=False | |
571 | 594 | app=buf.append |
572 | for x in range(len(extr)): | |
573 | params[x]=params[x].strip() | |
574 | if params[x]: | |
575 | app("lst.extend(%r)"%params[x].split()) | |
576 | (var,meth)=extr[x] | |
577 | if var=='SRC': | |
578 | if meth:app('lst.append(tsk.inputs%s)'%meth) | |
579 | else:app("lst.extend([a.path_from(cwdx) for a in tsk.inputs])") | |
580 | elif var=='TGT': | |
581 | if meth:app('lst.append(tsk.outputs%s)'%meth) | |
582 | else:app("lst.extend([a.path_from(cwdx) for a in tsk.outputs])") | |
583 | elif meth: | |
584 | if meth.startswith(':'): | |
595 | def replc(m): | |
596 | if m.group('and'): | |
597 | return' and ' | |
598 | elif m.group('or'): | |
599 | return' or ' | |
600 | else: | |
601 | x=m.group('var') | |
602 | if x not in dvars: | |
603 | dvars.append(x) | |
604 | return'env[%r]'%x | |
605 | for m in reg_act_noshell.finditer(line): | |
606 | if m.group('space'): | |
607 | merge=False | |
608 | continue | |
609 | elif m.group('text'): | |
610 | app('[%r]'%m.group('text').replace('$$','$')) | |
611 | elif m.group('subst'): | |
612 | var=m.group('var') | |
613 | code=m.group('code') | |
614 | if var=='SRC': | |
615 | if code: | |
616 | app('[tsk.inputs%s]'%code) | |
617 | else: | |
618 | app('[a.path_from(cwdx) for a in tsk.inputs]') | |
619 | elif var=='TGT': | |
620 | if code: | |
621 | app('[tsk.outputs%s]'%code) | |
622 | else: | |
623 | app('[a.path_from(cwdx) for a in tsk.outputs]') | |
624 | elif code: | |
625 | if code.startswith(':'): | |
626 | if not var in dvars: | |
627 | dvars.append(var) | |
628 | m=code[1:] | |
629 | if m=='SRC': | |
630 | m='[a.path_from(cwdx) for a in tsk.inputs]' | |
631 | elif m=='TGT': | |
632 | m='[a.path_from(cwdx) for a in tsk.outputs]' | |
633 | elif re_novar.match(m): | |
634 | m='[tsk.inputs%s]'%m[3:] | |
635 | elif re_novar.match(m): | |
636 | m='[tsk.outputs%s]'%m[3:] | |
637 | elif m[:3]not in('tsk','gen','bld'): | |
638 | dvars.append(m) | |
639 | m='%r'%m | |
640 | app('tsk.colon(%r, %s)'%(var,m)) | |
641 | elif code.startswith('?'): | |
642 | expr=re_cond.sub(replc,code[1:]) | |
643 | app('to_list(env[%r] if (%s) else [])'%(var,expr)) | |
644 | else: | |
645 | app('gen.to_list(%s%s)'%(var,code)) | |
646 | else: | |
647 | app('to_list(env[%r])'%var) | |
585 | 648 | if not var in dvars: |
586 | 649 | dvars.append(var) |
587 | m=meth[1:] | |
588 | if m=='SRC': | |
589 | m='[a.path_from(cwdx) for a in tsk.inputs]' | |
590 | elif m=='TGT': | |
591 | m='[a.path_from(cwdx) for a in tsk.outputs]' | |
592 | elif re_novar.match(m): | |
593 | m='[tsk.inputs%s]'%m[3:] | |
594 | elif re_novar.match(m): | |
595 | m='[tsk.outputs%s]'%m[3:] | |
596 | elif m[:3]not in('tsk','gen','bld'): | |
597 | dvars.append(m) | |
598 | m='%r'%m | |
599 | app('lst.extend(tsk.colon(%r, %s))'%(var,m)) | |
600 | else: | |
601 | app('lst.extend(gen.to_list(%s%s))'%(var,meth)) | |
602 | else: | |
603 | app('lst.extend(to_list(env[%r]))'%var) | |
604 | if not var in dvars: | |
605 | dvars.append(var) | |
606 | if extr: | |
607 | if params[-1]: | |
608 | app("lst.extend(%r)"%params[-1].split()) | |
650 | if merge: | |
651 | tmp='merge(%s, %s)'%(buf[-2],buf[-1]) | |
652 | del buf[-1] | |
653 | buf[-1]=tmp | |
654 | merge=True | |
655 | buf=['lst.extend(%s)'%x for x in buf] | |
609 | 656 | fun=COMPILE_TEMPLATE_NOSHELL%"\n\t".join(buf) |
610 | Logs.debug('action: %s'%fun.strip().splitlines()) | |
657 | Logs.debug('action: %s',fun.strip().splitlines()) | |
611 | 658 | return(funex(fun),dvars) |
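Both compilers now understand the `?` method prefix: the condition text after `?` is run through `re_cond`, translating `&` and `|` into Python `and`/`or` and bare names into `env[...]` reads, while collecting the dependency variables. A standalone version of that translation (the wrapper function is invented; the regex and `replc` body follow the hunks above):

```python
import re

re_cond = re.compile(r'(?P<var>\w+)|(?P<or>\|)|(?P<and>&)')

def compile_cond(cond):
    # turn e.g. 'FOO&BAR' into "env['FOO'] and env['BAR']", collecting
    # the variables the generated expression will read
    dvars = []
    def replc(m):
        if m.group('and'):
            return ' and '
        if m.group('or'):
            return ' or '
        x = m.group('var')
        if x not in dvars:
            dvars.append(x)
        return 'env[%r]' % x
    return re_cond.sub(replc, cond), dvars
```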
612 | 659 | def compile_fun(line,shell=False): |
613 | 660 | if isinstance(line,str): |
645 | 692 | classes[name]=cls |
646 | 693 | return cls |
647 | 694 | def always_run(cls): |
648 | old=cls.runnable_status | |
649 | def always(self): | |
650 | ret=old(self) | |
651 | if ret==SKIP_ME: | |
652 | ret=RUN_ME | |
653 | return ret | |
654 | cls.runnable_status=always | |
695 | Logs.warn('This decorator is deprecated, set always_run on the task class instead!') | |
696 | cls.always_run=True | |
655 | 697 | return cls |
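The decorator body above shrinks to a deprecation warning plus `cls.always_run=True`, since `runnable_status` now consults the attribute directly instead of needing its return value wrapped. The migration pattern (class names invented for illustration):

```python
class Task:
    always_run = False  # class-level switch now checked by runnable_status

class CopyResources(Task):
    # new style: declare the attribute on the task class instead of
    # wrapping runnable_status with the deprecated @always_run decorator
    always_run = True
```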
656 | 698 | def update_outputs(cls): |
657 | old_post_run=cls.post_run | |
658 | def post_run(self): | |
659 | old_post_run(self) | |
660 | for node in self.outputs: | |
661 | node.sig=node.cache_sig=Utils.h_file(node.abspath()) | |
662 | self.generator.bld.task_sigs[node.abspath()]=self.uid() | |
663 | cls.post_run=post_run | |
664 | old_runnable_status=cls.runnable_status | |
665 | def runnable_status(self): | |
666 | status=old_runnable_status(self) | |
667 | if status!=RUN_ME: | |
668 | return status | |
669 | try: | |
670 | bld=self.generator.bld | |
671 | prev_sig=bld.task_sigs[self.uid()] | |
672 | if prev_sig==self.signature(): | |
673 | for x in self.outputs: | |
674 | if not x.is_child_of(bld.bldnode): | |
675 | x.sig=Utils.h_file(x.abspath()) | |
676 | if not x.sig or bld.task_sigs[x.abspath()]!=self.uid(): | |
677 | return RUN_ME | |
678 | return SKIP_ME | |
679 | except OSError: | |
680 | pass | |
681 | except IOError: | |
682 | pass | |
683 | except KeyError: | |
684 | pass | |
685 | except IndexError: | |
686 | pass | |
687 | except AttributeError: | |
688 | pass | |
689 | return RUN_ME | |
690 | cls.runnable_status=runnable_status | |
691 | 699 | return cls |
1 | 1 | # encoding: utf-8 |
2 | 2 | # WARNING! Do not edit! https://waf.io/book/index.html#_obtaining_the_waf_file |
3 | 3 | |
4 | import copy,re,os | |
4 | import copy,re,os,functools | |
5 | 5 | from waflib import Task,Utils,Logs,Errors,ConfigSet,Node |
6 | 6 | feats=Utils.defaultdict(set) |
7 | 7 | HEADER_EXTS=['.h','.hpp','.hxx','.hh'] |
12 | 12 | self.source='' |
13 | 13 | self.target='' |
14 | 14 | self.meths=[] |
15 | self.prec=Utils.defaultdict(list) | |
16 | self.mappings={} | |
17 | 15 | self.features=[] |
18 | 16 | self.tasks=[] |
19 | 17 | if not'bld'in kw: |
25 | 23 | self.env=self.bld.env.derive() |
26 | 24 | self.path=self.bld.path |
27 | 25 | try: |
28 | self.idx=self.bld.idx[id(self.path)]=self.bld.idx.get(id(self.path),0)+1 | |
26 | self.idx=self.bld.idx[self.path]=self.bld.idx.get(self.path,0)+1 | |
29 | 27 | except AttributeError: |
30 | 28 | self.bld.idx={} |
31 | self.idx=self.bld.idx[id(self.path)]=1 | |
29 | self.idx=self.bld.idx[self.path]=1 | |
32 | 30 | for key,val in kw.items(): |
33 | 31 | setattr(self,key,val) |
34 | 32 | def __str__(self): |
35 | 33 | return"<task_gen %r declared in %s>"%(self.name,self.path.abspath()) |
36 | 34 | def __repr__(self): |
37 | 35 | lst=[] |
38 | for x in self.__dict__.keys(): | |
36 | for x in self.__dict__: | |
39 | 37 | if x not in('env','bld','compiled_tasks','tasks'): |
40 | 38 | lst.append("%s=%s"%(x,repr(getattr(self,x)))) |
41 | 39 | return"bld(%s) in %s"%(", ".join(lst),self.path.abspath()) |
40 | def get_cwd(self): | |
41 | return self.bld.bldnode | |
42 | 42 | def get_name(self): |
43 | 43 | try: |
44 | 44 | return self._name |
53 | 53 | self._name=name |
54 | 54 | name=property(get_name,set_name) |
55 | 55 | def to_list(self,val): |
56 | if isinstance(val,str):return val.split() | |
57 | else:return val | |
56 | if isinstance(val,str): | |
57 | return val.split() | |
58 | else: | |
59 | return val | |
58 | 60 | def post(self): |
59 | 61 | if getattr(self,'posted',None): |
60 | 62 | return False |
61 | 63 | self.posted=True |
62 | 64 | keys=set(self.meths) |
65 | keys.update(feats['*']) | |
63 | 66 | self.features=Utils.to_list(self.features) |
64 | for x in self.features+['*']: | |
67 | for x in self.features: | |
65 | 68 | st=feats[x] |
66 | if not st: | |
67 | if not x in Task.classes: | |
68 | Logs.warn('feature %r does not exist - bind at least one method to it'%x) | |
69 | keys.update(list(st)) | |
69 | if st: | |
70 | keys.update(st) | |
71 | elif not x in Task.classes: | |
72 | Logs.warn('feature %r does not exist - bind at least one method to it?',x) | |
70 | 73 | prec={} |
71 | prec_tbl=self.prec or task_gen.prec | |
74 | prec_tbl=self.prec | |
72 | 75 | for x in prec_tbl: |
73 | 76 | if x in keys: |
74 | 77 | prec[x]=prec_tbl[x] |
82 | 85 | out=[] |
83 | 86 | while tmp: |
84 | 87 | e=tmp.pop() |
85 | if e in keys:out.append(e) | |
88 | if e in keys: | |
89 | out.append(e) | |
86 | 90 | try: |
87 | 91 | nlst=prec[e] |
88 | 92 | except KeyError: |
96 | 100 | else: |
97 | 101 | tmp.append(x) |
98 | 102 | if prec: |
99 | txt='\n'.join(['- %s after %s'%(k,repr(v))for k,v in prec.items()]) | |
100 | raise Errors.WafError('Cycle detected in the method execution\n%s'%txt) | |
103 | buf=['Cycle detected in the method execution:'] | |
104 | for k,v in prec.items(): | |
105 | buf.append('- %s after %s'%(k,[x for x in v if x in prec])) | |
106 | raise Errors.WafError('\n'.join(buf)) | |
101 | 107 | out.reverse() |
102 | 108 | self.meths=out |
103 | Logs.debug('task_gen: posting %s %d'%(self,id(self))) | |
109 | Logs.debug('task_gen: posting %s %d',self,id(self)) | |
104 | 110 | for x in out: |
105 | 111 | try: |
106 | 112 | v=getattr(self,x) |
107 | 113 | except AttributeError: |
108 | 114 | raise Errors.WafError('%r is not a valid task generator method'%x) |
109 | Logs.debug('task_gen: -> %s (%d)'%(x,id(self))) | |
115 | Logs.debug('task_gen: -> %s (%d)',x,id(self)) | |
110 | 116 | v() |
111 | Logs.debug('task_gen: posted %s'%self.name) | |
117 | Logs.debug('task_gen: posted %s',self.name) | |
112 | 118 | return True |
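`post()` above orders the generator methods under the collected precedence constraints before invoking them, and the rewritten cycle message now lists only the methods actually involved in the cycle. A minimal standalone model of such an ordering (semantics simplified, function and data invented; in this model `prec[m]` lists the methods that must run before `m`):

```python
def order_methods(keys, prec):
    # Kahn-style ordering with a cycle check, echoing the diagnostic above
    pending = {k: set(v) for k, v in prec.items() if k in keys}
    ready = [k for k in keys if k not in pending]
    out = []
    while ready:
        e = ready.pop()
        out.append(e)
        for x in list(pending):
            pending[x].discard(e)
            if not pending[x]:
                del pending[x]
                ready.append(x)
    if pending:
        raise RuntimeError('Cycle detected in the method execution: %r' % pending)
    return out
```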
113 | 119 | def get_hook(self,node): |
114 | 120 | name=node.name |
115 | if self.mappings: | |
116 | for k in self.mappings: | |
121 | for k in self.mappings: | |
122 | try: | |
117 | 123 | if name.endswith(k): |
118 | 124 | return self.mappings[k] |
119 | for k in task_gen.mappings: | |
120 | if name.endswith(k): | |
121 | return task_gen.mappings[k] | |
122 | raise Errors.WafError("File %r has no mapping in %r (have you forgotten to load a waf tool?)"%(node,task_gen.mappings.keys())) | |
125 | except TypeError: | |
126 | if k.match(name): | |
127 | return self.mappings[k] | |
128 | keys=list(self.mappings.keys()) | |
129 | raise Errors.WafError("File %r has no mapping in %r (load a waf tool?)"%(node,keys)) | |
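The `get_hook` rewrite above lets mapping keys be either plain suffix strings or compiled regex patterns: `name.endswith(pattern)` raises `TypeError` for a pattern object, and the handler falls back to `pattern.match(name)`. A standalone version of that lookup (mapping contents invented):

```python
import re

def get_hook(mappings, name):
    # string keys match by suffix; compiled-pattern keys (for which
    # str.endswith raises TypeError) fall back to .match()
    for k in mappings:
        try:
            if name.endswith(k):
                return mappings[k]
        except TypeError:
            if k.match(name):
                return mappings[k]
    raise KeyError(name)
```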
123 | 130 | def create_task(self,name,src=None,tgt=None,**kw): |
124 | 131 | task=Task.classes[name](env=self.env.derive(),generator=self) |
125 | 132 | if src: |
151 | 158 | name=rule |
152 | 159 | cls=Task.task_factory(name,rule,color=color,ext_in=ext_in,ext_out=ext_out,before=before,after=after,scan=scan,shell=shell) |
153 | 160 | def x_file(self,node): |
154 | ext=decider and decider(self,node)or cls.ext_out | |
155 | 161 | if ext_in: |
156 | 162 | _ext_in=ext_in[0] |
157 | 163 | tsk=self.create_task(name,node) |
158 | 164 | cnt=0 |
159 | keys=set(self.mappings.keys())|set(self.__class__.mappings.keys()) | |
165 | ext=decider(self,node)if decider else cls.ext_out | |
160 | 166 | for x in ext: |
161 | 167 | k=node.change_ext(x,ext_in=_ext_in) |
162 | 168 | tsk.outputs.append(k) |
164 | 170 | if cnt<int(reentrant): |
165 | 171 | self.source.append(k) |
166 | 172 | else: |
167 | for y in keys: | |
173 | for y in self.mappings: | |
168 | 174 | if k.name.endswith(y): |
169 | 175 | self.source.append(k) |
170 | 176 | break |
171 | 177 | cnt+=1 |
172 | 178 | if install_path: |
173 | self.bld.install_files(install_path,tsk.outputs) | |
179 | self.install_task=self.add_install_files(install_to=install_path,install_from=tsk.outputs) | |
174 | 180 | return tsk |
175 | 181 | for x in cls.ext_in: |
176 | 182 | task_gen.mappings[x]=x_file |
267 | 273 | nodes.append(node) |
268 | 274 | return[nodes,[]] |
269 | 275 | cls.scan=scan |
270 | if getattr(self,'update_outputs',None): | |
271 | Task.update_outputs(cls) | |
272 | 276 | if getattr(self,'always',None): |
273 | Task.always_run(cls) | |
277 | cls.always_run=True | |
278 | if getattr(self,'timeout',None): | |
279 | cls.timeout=self.timeout | |
274 | 280 | for x in('after','before','ext_in','ext_out'): |
275 | 281 | setattr(cls,x,getattr(self,x,[])) |
276 | 282 | if getattr(self,'cache_rule','True'): |
292 | 298 | x.parent.mkdir() |
293 | 299 | tsk.outputs.append(x) |
294 | 300 | if getattr(self,'install_path',None): |
295 | self.bld.install_files(self.install_path,tsk.outputs,chmod=getattr(self,'chmod',Utils.O644)) | |
301 | self.install_task=self.add_install_files(install_to=self.install_path,install_from=tsk.outputs,chmod=getattr(self,'chmod',Utils.O644)) | |
296 | 302 | if getattr(self,'source',None): |
297 | 303 | tsk.inputs=self.to_nodes(self.source) |
298 | 304 | self.source=[] |
299 | 305 | if getattr(self,'cwd',None): |
300 | 306 | tsk.cwd=self.cwd |
307 | if isinstance(tsk.run,functools.partial): | |
308 | tsk.run=functools.partial(tsk.run,tsk) | |
301 | 309 | @feature('seq') |
302 | 310 | def sequence_order(self): |
303 | 311 | if self.meths and self.meths[-1]!='sequence_order': |
358 | 366 | d[x]=tmp |
359 | 367 | code=code%d |
360 | 368 | self.outputs[0].write(code,encoding=getattr(self.generator,'encoding','ISO8859-1')) |
361 | self.generator.bld.raw_deps[self.uid()]=self.dep_vars=lst | |
369 | self.generator.bld.raw_deps[self.uid()]=lst | |
362 | 370 | try:delattr(self,'cache_sig') |
363 | 371 | except AttributeError:pass |
364 | 372 | self.force_permissions() |
379 | 387 | @extension('.pc.in') |
380 | 388 | def add_pcfile(self,node): |
381 | 389 | tsk=self.create_task('subst_pc',node,node.change_ext('.pc','.pc.in')) |
382 | self.bld.install_files(getattr(self,'install_path','${LIBDIR}/pkgconfig/'),tsk.outputs) | |
390 | self.install_task=self.add_install_files(install_to=getattr(self,'install_path','${LIBDIR}/pkgconfig/'),install_from=tsk.outputs) | |
383 | 391 | class subst(subst_pc): |
384 | 392 | pass |
385 | 393 | @feature('subst') |
401 | 409 | a=self.path.find_node(x) |
402 | 410 | b=self.path.get_bld().make_node(y) |
403 | 411 | if not os.path.isfile(b.abspath()): |
404 | b.sig=None | |
405 | 412 | b.parent.mkdir() |
406 | 413 | else: |
407 | 414 | if isinstance(x,str): |
429 | 436 | break |
430 | 437 | inst_to=getattr(self,'install_path',None) |
431 | 438 | if inst_to: |
432 | self.bld.install_files(inst_to,b,chmod=getattr(self,'chmod',Utils.O644)) | |
439 | self.install_task=self.add_install_files(install_to=inst_to,install_from=b,chmod=getattr(self,'chmod',Utils.O644)) | |
433 | 440 | self.source=[] |
2 | 2 | # WARNING! Do not edit! https://waf.io/book/index.html#_obtaining_the_waf_file |
3 | 3 | |
4 | 4 | from waflib import Task |
5 | import waflib.Task | |
6 | 5 | from waflib.Tools.ccroot import link_task,stlink_task |
7 | 6 | from waflib.TaskGen import extension |
8 | 7 | class asm(Task.Task): |
20 | 19 | class asmstlib(stlink_task): |
21 | 20 | pass |
22 | 21 | def configure(conf): |
23 | conf.env['ASMPATH_ST']='-I%s' | |
22 | conf.env.ASMPATH_ST='-I%s' |
9 | 9 | ext_out=['.h'] |
10 | 10 | @extension('.y','.yc','.yy') |
11 | 11 | def big_bison(self,node): |
12 | has_h='-d'in self.env['BISONFLAGS'] | |
12 | has_h='-d'in self.env.BISONFLAGS | |
13 | 13 | outs=[] |
14 | 14 | if node.name.endswith('.yc'): |
15 | 15 | outs.append(node.change_ext('.tab.cc')) |
20 | 20 | if has_h: |
21 | 21 | outs.append(node.change_ext('.tab.h')) |
22 | 22 | tsk=self.create_task('bison',node,outs) |
23 | tsk.cwd=node.parent.get_bld().abspath() | |
23 | tsk.cwd=node.parent.get_bld() | |
24 | 24 | self.source.append(outs[0]) |
25 | 25 | def configure(conf): |
26 | 26 | conf.find_program('bison',var='BISON') |
10 | 10 | return self.create_compiled_task('cxx',node) |
11 | 11 | return self.create_compiled_task('c',node) |
12 | 12 | class c(Task.Task): |
13 | run_str='${CC} ${ARCH_ST:ARCH} ${CFLAGS} ${CPPFLAGS} ${FRAMEWORKPATH_ST:FRAMEWORKPATH} ${CPPPATH_ST:INCPATHS} ${DEFINES_ST:DEFINES} ${CC_SRC_F}${SRC} ${CC_TGT_F}${TGT[0].abspath()}' | |
13 | run_str='${CC} ${ARCH_ST:ARCH} ${CFLAGS} ${FRAMEWORKPATH_ST:FRAMEWORKPATH} ${CPPPATH_ST:INCPATHS} ${DEFINES_ST:DEFINES} ${CC_SRC_F}${SRC} ${CC_TGT_F}${TGT[0].abspath()} ${CPPFLAGS}' | |
14 | 14 | vars=['CCDEPS'] |
15 | 15 | ext_in=['.h'] |
16 | 16 | scan=c_preproc.scan |
6 | 6 | def get_extensions(lst): |
7 | 7 | ret=[] |
8 | 8 | for x in Utils.to_list(lst): |
9 | try: | |
10 | if not isinstance(x,str): | |
11 | x=x.name | |
12 | ret.append(x[x.rfind('.')+1:]) | |
13 | except Exception: | |
14 | pass | |
9 | if not isinstance(x,str): | |
10 | x=x.name | |
11 | ret.append(x[x.rfind('.')+1:]) | |
15 | 12 | return ret |
16 | 13 | def sniff_features(**kw): |
17 | 14 | exts=get_extensions(kw['source']) |
18 | type=kw['_type'] | |
15 | typ=kw['typ'] | |
19 | 16 | feats=[] |
20 | 17 | for x in'cxx cpp c++ cc C'.split(): |
21 | 18 | if x in exts: |
32 | 29 | if'java'in exts: |
33 | 30 | feats.append('java') |
34 | 31 | return'java' |
35 | if type in('program','shlib','stlib'): | |
32 | if typ in('program','shlib','stlib'): | |
36 | 33 | will_link=False |
37 | 34 | for x in feats: |
38 | 35 | if x in('cxx','d','fc','c'): |
39 | feats.append(x+type) | |
36 | feats.append(x+typ) | |
40 | 37 | will_link=True |
41 | 38 | if not will_link and not kw.get('features',[]): |
42 | 39 | raise Errors.WafError('Cannot link from %r, try passing eg: features="c cprogram"?'%kw) |
43 | 40 | return feats |
44 | def set_features(kw,_type): | |
45 | kw['_type']=_type | |
41 | def set_features(kw,typ): | |
42 | kw['typ']=typ | |
46 | 43 | kw['features']=Utils.to_list(kw.get('features',[]))+Utils.to_list(sniff_features(**kw)) |
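The `_type`→`typ` rename threads through `sniff_features` and `set_features` as shown: source extensions select base features, and for linkable target types the matching link feature (e.g. `cxxprogram`) is appended. The gist in a heavily abbreviated standalone form (extension table truncated, function name invented):

```python
def sniff(sources, typ):
    # map file extensions to features, then add link features per target type
    exts = {s[s.rfind('.') + 1:] for s in sources}
    feats = []
    if exts & {'cxx', 'cpp', 'c++', 'cc', 'C'}:
        feats.append('cxx')
    if 'c' in exts:
        feats.append('c')
    if typ in ('program', 'shlib', 'stlib'):
        feats += [x + typ for x in feats if x in ('c', 'cxx')]
    return feats
```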
47 | 44 | @conf |
48 | 45 | def program(bld,*k,**kw): |
1 | 1 | # encoding: utf-8 |
2 | 2 | # WARNING! Do not edit! https://waf.io/book/index.html#_obtaining_the_waf_file |
3 | 3 | |
4 | from __future__ import with_statement | |
4 | 5 | import os,re,shlex |
5 | 6 | from waflib import Build,Utils,Task,Options,Logs,Errors,Runner |
6 | 7 | from waflib.TaskGen import after_method,feature |
53 | 54 | lex.whitespace_split=True |
54 | 55 | lex.commenters='' |
55 | 56 | lst=list(lex) |
56 | app=env.append_value | |
57 | appu=env.append_unique | |
58 | 57 | uselib=uselib_store |
58 | def app(var,val): | |
59 | env.append_value('%s_%s'%(var,uselib),val) | |
60 | def appu(var,val): | |
61 | env.append_unique('%s_%s'%(var,uselib),val) | |
59 | 62 | static=False |
60 | 63 | while lst: |
61 | 64 | x=lst.pop(0) |
62 | 65 | st=x[:2] |
63 | 66 | ot=x[2:] |
64 | 67 | if st=='-I'or st=='/I': |
65 | if not ot:ot=lst.pop(0) | |
66 | appu('INCLUDES_'+uselib,[ot]) | |
68 | if not ot: | |
69 | ot=lst.pop(0) | |
70 | appu('INCLUDES',ot) | |
67 | 71 | elif st=='-i': |
68 | 72 | tmp=[x,lst.pop(0)] |
69 | 73 | app('CFLAGS',tmp) |
70 | 74 | app('CXXFLAGS',tmp) |
71 | 75 | elif st=='-D'or(env.CXX_NAME=='msvc'and st=='/D'): |
72 | if not ot:ot=lst.pop(0) | |
73 | app('DEFINES_'+uselib,[ot]) | |
76 | if not ot: | |
77 | ot=lst.pop(0) | |
78 | app('DEFINES',ot) | |
74 | 79 | elif st=='-l': |
75 | if not ot:ot=lst.pop(0) | |
76 | prefix=(force_static or static)and'STLIB_'or'LIB_' | |
77 | appu(prefix+uselib,[ot]) | |
80 | if not ot: | |
81 | ot=lst.pop(0) | |
82 | prefix='STLIB'if(force_static or static)else'LIB' | |
83 | app(prefix,ot) | |
78 | 84 | elif st=='-L': |
79 | if not ot:ot=lst.pop(0) | |
80 | prefix=(force_static or static)and'STLIBPATH_'or'LIBPATH_' | |
81 | appu(prefix+uselib,[ot]) | |
85 | if not ot: | |
86 | ot=lst.pop(0) | |
87 | prefix='STLIBPATH'if(force_static or static)else'LIBPATH' | |
88 | appu(prefix,ot) | |
82 | 89 | elif x.startswith('/LIBPATH:'): |
83 | prefix=(force_static or static)and'STLIBPATH_'or'LIBPATH_' | |
84 | appu(prefix+uselib,[x.replace('/LIBPATH:','')]) | |
90 | prefix='STLIBPATH'if(force_static or static)else'LIBPATH' | |
91 | appu(prefix,x.replace('/LIBPATH:','')) | |
85 | 92 | elif x.startswith('-std='): |
86 | if'++'in x: | |
87 | app('CXXFLAGS_'+uselib,[x]) | |
88 | else: | |
89 | app('CFLAGS_'+uselib,[x]) | |
93 | prefix='CXXFLAGS'if'++'in x else'CFLAGS' | |
94 | app(prefix,x) | |
90 | 95 | elif x=='-pthread'or x.startswith('+'): |
91 | app('CFLAGS_'+uselib,[x]) | |
92 | app('CXXFLAGS_'+uselib,[x]) | |
93 | app('LINKFLAGS_'+uselib,[x]) | |
96 | app('CFLAGS',x) | |
97 | app('CXXFLAGS',x) | |
98 | app('LINKFLAGS',x) | |
94 | 99 | elif x=='-framework': |
95 | appu('FRAMEWORK_'+uselib,[lst.pop(0)]) | |
100 | appu('FRAMEWORK',lst.pop(0)) | |
96 | 101 | elif x.startswith('-F'): |
97 | appu('FRAMEWORKPATH_'+uselib,[x[2:]]) | |
102 | appu('FRAMEWORKPATH',x[2:]) | |
98 | 103 | elif x=='-Wl,-rpath'or x=='-Wl,-R': |
99 | app('RPATH_'+uselib,lst.pop(0).lstrip('-Wl,')) | |
104 | app('RPATH',lst.pop(0).lstrip('-Wl,')) | |
100 | 105 | elif x.startswith('-Wl,-R,'): |
101 | app('RPATH_'+uselib,x[7:]) | |
106 | app('RPATH',x[7:]) | |
102 | 107 | elif x.startswith('-Wl,-R'): |
103 | app('RPATH_'+uselib,x[6:]) | |
108 | app('RPATH',x[6:]) | |
104 | 109 | elif x.startswith('-Wl,-rpath,'): |
105 | app('RPATH_'+uselib,x[11:]) | |
110 | app('RPATH',x[11:]) | |
106 | 111 | elif x=='-Wl,-Bstatic'or x=='-Bstatic': |
107 | 112 | static=True |
108 | 113 | elif x=='-Wl,-Bdynamic'or x=='-Bdynamic': |
109 | 114 | static=False |
110 | 115 | elif x.startswith('-Wl'): |
111 | app('LINKFLAGS_'+uselib,[x]) | |
112 | elif x.startswith('-m')or x.startswith('-f')or x.startswith('-dynamic'): | |
113 | app('CFLAGS_'+uselib,[x]) | |
114 | app('CXXFLAGS_'+uselib,[x]) | |
116 | app('LINKFLAGS',x) | |
117 | elif x.startswith(('-m','-f','-dynamic')): | |
118 | app('CFLAGS',x) | |
119 | app('CXXFLAGS',x) | |
115 | 120 | elif x.startswith('-bundle'): |
116 | app('LINKFLAGS_'+uselib,[x]) | |
117 | elif x.startswith('-undefined')or x.startswith('-Xlinker'): | |
121 | app('LINKFLAGS',x) | |
122 | elif x.startswith(('-undefined','-Xlinker')): | |
118 | 123 | arg=lst.pop(0) |
119 | app('LINKFLAGS_'+uselib,[x,arg]) | |
120 | elif x.startswith('-arch')or x.startswith('-isysroot'): | |
124 | app('LINKFLAGS',[x,arg]) | |
125 | elif x.startswith(('-arch','-isysroot')): | |
121 | 126 | tmp=[x,lst.pop(0)] |
122 | app('CFLAGS_'+uselib,tmp) | |
123 | app('CXXFLAGS_'+uselib,tmp) | |
124 | app('LINKFLAGS_'+uselib,tmp) | |
125 | elif x.endswith('.a')or x.endswith('.so')or x.endswith('.dylib')or x.endswith('.lib'): | |
126 | appu('LINKFLAGS_'+uselib,[x]) | |
127 | app('CFLAGS',tmp) | |
128 | app('CXXFLAGS',tmp) | |
129 | app('LINKFLAGS',tmp) | |
130 | elif x.endswith(('.a','.so','.dylib','.lib')): | |
131 | appu('LINKFLAGS',x) | |
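The hunk above rewrites waf's `parse_flags` dispatch: the per-uselib suffixed keys (`CFLAGS_<uselib>`) become plain keys, and the long `startswith(...)or startswith(...)` chains collapse into tuple-argument `startswith` calls. A minimal standalone sketch of the same flag-classification idea follows; the function name and return convention are illustrative, not waf's actual API.

```python
# Sketch of compiler/linker flag classification, loosely modeled on the
# parse_flags hunk above (illustrative names, not waf's API).
def classify_flag(flag):
    """Return the env variable(s) a single flag should be appended to."""
    if flag.startswith('-std='):
        # C++ standards go to CXXFLAGS, C standards to CFLAGS
        return 'CXXFLAGS' if '++' in flag else 'CFLAGS'
    if flag == '-pthread':
        # -pthread affects compilation and linking alike
        return ('CFLAGS', 'CXXFLAGS', 'LINKFLAGS')
    if flag.startswith(('-Wl,-rpath,', '-Wl,-R')):
        return 'RPATH'
    if flag.startswith('-Wl'):
        return 'LINKFLAGS'
    if flag.startswith(('-m', '-f', '-dynamic')):
        # affects both compilers; waf also forwards some of these to the linker
        return ('CFLAGS', 'CXXFLAGS')
    if flag.endswith(('.a', '.so', '.dylib', '.lib')):
        # bare library files are passed straight to the link step
        return 'LINKFLAGS'
    return None
```

Flags like `-framework` or `-arch` that consume the *next* token are handled in waf by popping from the remaining list, which a single-flag classifier like this cannot show.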
127 | 132 | @conf |
128 | 133 | def validate_cfg(self,kw): |
129 | 134 | if not'path'in kw: |
141 | 146 | if'modversion'in kw: |
142 | 147 | if not'msg'in kw: |
143 | 148 | kw['msg']='Checking for %r version'%kw['modversion'] |
149 | if not'uselib_store'in kw: | |
150 | kw['uselib_store']=kw['modversion'] | |
151 | if not'define_name'in kw: | |
152 | kw['define_name']='%s_VERSION'%Utils.quote_define_name(kw['uselib_store']) | |
144 | 153 | return |
145 | for x in cfg_ver.keys(): | |
154 | if not'package'in kw: | |
155 | raise ValueError('a package name is required') | |
156 | if not'uselib_store'in kw: | |
157 | kw['uselib_store']=kw['package'].upper() | |
158 | if not'define_name'in kw: | |
159 | kw['define_name']=self.have_define(kw['uselib_store']) | |
160 | if not'msg'in kw: | |
161 | kw['msg']='Checking for %r'%(kw['package']or kw['path']) | |
162 | for x in cfg_ver: | |
146 | 163 | y=x.replace('-','_') |
147 | 164 | if y in kw: |
148 | if not'package'in kw: | |
149 | raise ValueError('%s requires a package'%x) | |
165 | package=kw['package'] | |
166 | if Logs.verbose: | |
167 | Logs.warn('Passing %r to conf.check_cfg() is obsolete, pass parameters directly, eg:',y) | |
168 | Logs.warn(" conf.check_cfg(package='%s', args=['--libs', '--cflags', '%s >= 1.6'])",package,package) | |
150 | 169 | if not'msg'in kw: |
151 | kw['msg']='Checking for %r %s %s'%(kw['package'],cfg_ver[x],kw[y]) | |
152 | return | |
153 | if not'define_name'in kw: | |
154 | pkgname=kw.get('uselib_store',kw['package'].upper()) | |
155 | kw['define_name']=self.have_define(pkgname) | |
156 | if not'uselib_store'in kw: | |
157 | self.undefine(kw['define_name']) | |
158 | if not'msg'in kw: | |
159 | kw['msg']='Checking for %r'%(kw['package']or kw['path']) | |
170 | kw['msg']='Checking for %r %s %s'%(package,cfg_ver[x],kw[y]) | |
171 | break | |
160 | 172 | @conf |
161 | 173 | def exec_cfg(self,kw): |
162 | 174 | path=Utils.to_list(kw['path']) |
163 | 175 | env=self.env.env or None |
176 | if kw.get('pkg_config_path'): | |
177 | if not env: | |
178 | env=dict(self.environ) | |
179 | env['PKG_CONFIG_PATH']=kw['pkg_config_path'] | |
164 | 180 | def define_it(): |
165 | pkgname=kw.get('uselib_store',kw['package'].upper()) | |
166 | if kw.get('global_define'): | |
167 | self.define(self.have_define(kw['package']),1,False) | |
168 | else: | |
169 | self.env.append_unique('DEFINES_%s'%pkgname,"%s=1"%self.have_define(pkgname)) | |
170 | self.env[self.have_define(pkgname)]=1 | |
181 | define_name=kw['define_name'] | |
182 | if kw.get('global_define',1): | |
183 | self.define(define_name,1,False) | |
184 | else: | |
185 | self.env.append_unique('DEFINES_%s'%kw['uselib_store'],"%s=1"%define_name) | |
186 | if kw.get('add_have_to_env',1): | |
187 | self.env[define_name]=1 | |
171 | 188 | if'atleast_pkgconfig_version'in kw: |
172 | 189 | cmd=path+['--atleast-pkgconfig-version=%s'%kw['atleast_pkgconfig_version']] |
173 | 190 | self.cmd_and_log(cmd,env=env) |
184 | 201 | break |
185 | 202 | if'modversion'in kw: |
186 | 203 | version=self.cmd_and_log(path+['--modversion',kw['modversion']],env=env).strip() |
187 | self.define('%s_VERSION'%Utils.quote_define_name(kw.get('uselib_store',kw['modversion'])),version) | |
204 | self.define(kw['define_name'],version) | |
188 | 205 | return version |
189 | 206 | lst=[]+path |
190 | defi=kw.get('define_variable',None) | |
207 | defi=kw.get('define_variable') | |
191 | 208 | if not defi: |
192 | 209 | defi=self.env.PKG_CONFIG_DEFINES or{} |
193 | 210 | for key,val in defi.items(): |
201 | 218 | lst.extend(Utils.to_list(kw['package'])) |
202 | 219 | if'variables'in kw: |
203 | 220 | v_env=kw.get('env',self.env) |
204 | uselib=kw.get('uselib_store',kw['package'].upper()) | |
205 | 221 | vars=Utils.to_list(kw['variables']) |
206 | 222 | for v in vars: |
207 | 223 | val=self.cmd_and_log(lst+['--variable='+v],env=env).strip() |
208 | var='%s_%s'%(uselib,v) | |
224 | var='%s_%s'%(kw['uselib_store'],v) | |
209 | 225 | v_env[var]=val |
210 | 226 | if not'okmsg'in kw: |
211 | 227 | kw['okmsg']='yes' |
214 | 230 | if not'okmsg'in kw: |
215 | 231 | kw['okmsg']='yes' |
216 | 232 | define_it() |
217 | self.parse_flags(ret,kw.get('uselib_store',kw['package'].upper()),kw.get('env',self.env),force_static=static,posix=kw.get('posix',None)) | |
233 | self.parse_flags(ret,kw['uselib_store'],kw.get('env',self.env),force_static=static,posix=kw.get('posix')) | |
218 | 234 | return ret |
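`exec_cfg` above shells out to `pkg-config`, optionally injecting `PKG_CONFIG_PATH` and `--define-variable` substitutions before asking for `--cflags`/`--libs`. A small sketch of how such a command line can be assembled, assuming standard `pkg-config` options (the function and parameter names here are made up for illustration):

```python
# Sketch of pkg-config command assembly in the spirit of exec_cfg above
# (hypothetical helper; waf builds the list incrementally from kw/env).
def build_pkgconfig_cmd(package, args=('--cflags', '--libs'),
                        define_variables=None):
    cmd = ['pkg-config']
    # --define-variable overrides .pc-file variables such as prefix
    for key, val in (define_variables or {}).items():
        cmd.append('--define-variable=%s=%s' % (key, val))
    cmd.extend(args)
    cmd.append(package)
    return cmd
```

The resulting list would then be run with an environment whose `PKG_CONFIG_PATH` points at any extra `.pc` directories, matching the new `pkg_config_path` handling in the hunk.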
219 | 235 | @conf |
220 | 236 | def check_cfg(self,*k,**kw): |
249 | 265 | o=bld(features=bld.kw['features'],source=bld.kw['compile_filename'],target='testprog') |
250 | 266 | for k,v in bld.kw.items(): |
251 | 267 | setattr(o,k,v) |
252 | if not bld.kw.get('quiet',None): | |
268 | if not bld.kw.get('quiet'): | |
253 | 269 | bld.conf.to_log("==>\n%s\n<=="%bld.kw['code']) |
254 | 270 | @conf |
255 | 271 | def validate_c(self,kw): |
260 | 276 | env=kw['env'] |
261 | 277 | if not'compiler'in kw and not'features'in kw: |
262 | 278 | kw['compiler']='c' |
263 | if env['CXX_NAME']and Task.classes.get('cxx',None): | |
279 | if env.CXX_NAME and Task.classes.get('cxx'): | |
264 | 280 | kw['compiler']='cxx' |
265 | if not self.env['CXX']: | |
281 | if not self.env.CXX: | |
266 | 282 | self.fatal('a c++ compiler is required') |
267 | 283 | else: |
268 | if not self.env['CC']: | |
284 | if not self.env.CC: | |
269 | 285 | self.fatal('a c compiler is required') |
270 | 286 | if not'compile_mode'in kw: |
271 | 287 | kw['compile_mode']='c' |
295 | 311 | if not'header_name'in kw: |
296 | 312 | kw['header_name']=[] |
297 | 313 | fwk='%s/%s.h'%(fwkname,fwkname) |
298 | if kw.get('remove_dot_h',None): | |
314 | if kw.get('remove_dot_h'): | |
299 | 315 | fwk=fwk[:-2] |
300 | 316 | kw['header_name']=Utils.to_list(kw['header_name'])+[fwk] |
301 | 317 | kw['msg']='Checking for framework %s'%fwkname |
330 | 346 | if not'msg'in kw: |
331 | 347 | kw['msg']='Checking for header %s'%kw['header_name'] |
332 | 348 | l=Utils.to_list(kw['header_name']) |
333 | assert len(l)>0,'list of headers in header_name is empty' | |
349 | assert len(l),'list of headers in header_name is empty' | |
334 | 350 | kw['code']=to_header(kw)+SNIP_EMPTY_PROGRAM |
335 | 351 | if not'uselib_store'in kw: |
336 | 352 | kw['uselib_store']=l[0].upper() |
362 | 378 | kw['execute']=False |
363 | 379 | if kw['execute']: |
364 | 380 | kw['features'].append('test_exec') |
365 | kw['chmod']=493 | |
381 | kw['chmod']=Utils.O755 | |
366 | 382 | if not'errmsg'in kw: |
367 | 383 | kw['errmsg']='not found' |
368 | 384 | if not'okmsg'in kw: |
390 | 406 | is_success=(kw['success']==0) |
391 | 407 | else: |
392 | 408 | is_success=(kw['success']==0) |
393 | if'define_name'in kw: | |
409 | if kw.get('define_name'): | |
394 | 410 | comment=kw.get('comment','') |
395 | 411 | define_name=kw['define_name'] |
396 | if'header_name'in kw or'function_name'in kw or'type_name'in kw or'fragment'in kw: | |
397 | if kw['execute']and kw.get('define_ret',None)and isinstance(is_success,str): | |
412 | if kw['execute']and kw.get('define_ret')and isinstance(is_success,str): | |
413 | if kw.get('global_define',1): | |
398 | 414 | self.define(define_name,is_success,quote=kw.get('quote',1),comment=comment) |
399 | 415 | else: |
416 | if kw.get('quote',1): | |
417 | succ='"%s"'%is_success | |
418 | else: | |
419 | succ=int(is_success) | |
420 | val='%s=%s'%(define_name,succ) | |
421 | var='DEFINES_%s'%kw['uselib_store'] | |
422 | self.env.append_value(var,val) | |
423 | else: | |
424 | if kw.get('global_define',1): | |
400 | 425 | self.define_cond(define_name,is_success,comment=comment) |
401 | else: | |
402 | self.define_cond(define_name,is_success,comment=comment) | |
403 | if kw.get('global_define',None): | |
404 | self.env[kw['define_name']]=is_success | |
426 | else: | |
427 | var='DEFINES_%s'%kw['uselib_store'] | |
428 | self.env.append_value(var,'%s=%s'%(define_name,int(is_success))) | |
429 | if kw.get('add_have_to_env',1): | |
430 | if kw.get('uselib_store'): | |
431 | self.env[self.have_define(kw['uselib_store'])]=1 | |
432 | else: | |
433 | self.env[define_name]=int(is_success) | |
405 | 434 | if'header_name'in kw: |
406 | 435 | if kw.get('auto_add_header_name',False): |
407 | 436 | self.env.append_value(INCKEYS,Utils.to_list(kw['header_name'])) |
408 | 437 | if is_success and'uselib_store'in kw: |
409 | 438 | from waflib.Tools import ccroot |
410 | _vars=set([]) | |
439 | _vars=set() | |
411 | 440 | for x in kw['features']: |
412 | 441 | if x in ccroot.USELIB_VARS: |
413 | 442 | _vars|=ccroot.USELIB_VARS[x] |
414 | 443 | for k in _vars: |
415 | lk=k.lower() | |
416 | if lk in kw: | |
417 | val=kw[lk] | |
418 | if isinstance(val,str): | |
419 | val=val.rstrip(os.path.sep) | |
420 | self.env.append_unique(k+'_'+kw['uselib_store'],Utils.to_list(val)) | |
444 | x=k.lower() | |
445 | if x in kw: | |
446 | self.env.append_value(k+'_'+kw['uselib_store'],kw[x]) | |
421 | 447 | return is_success |
422 | 448 | @conf |
423 | 449 | def check(self,*k,**kw): |
482 | 508 | return coms.get(key,'') |
483 | 509 | @conf |
484 | 510 | def define(self,key,val,quote=True,comment=''): |
485 | assert key and isinstance(key,str) | |
511 | assert isinstance(key,str) | |
512 | if not key: | |
513 | return | |
486 | 514 | if val is True: |
487 | 515 | val=1 |
488 | 516 | elif val in(False,None): |
493 | 521 | s=quote and'%s="%s"'or'%s=%s' |
494 | 522 | app=s%(key,str(val)) |
495 | 523 | ban=key+'=' |
496 | lst=self.env['DEFINES'] | |
524 | lst=self.env.DEFINES | |
497 | 525 | for x in lst: |
498 | 526 | if x.startswith(ban): |
499 | 527 | lst[lst.index(x)]=app |
504 | 532 | self.set_define_comment(key,comment) |
505 | 533 | @conf |
506 | 534 | def undefine(self,key,comment=''): |
507 | assert key and isinstance(key,str) | |
535 | assert isinstance(key,str) | |
536 | if not key: | |
537 | return | |
508 | 538 | ban=key+'=' |
509 | lst=[x for x in self.env['DEFINES']if not x.startswith(ban)] | |
510 | self.env['DEFINES']=lst | |
539 | lst=[x for x in self.env.DEFINES if not x.startswith(ban)] | |
540 | self.env.DEFINES=lst | |
511 | 541 | self.env.append_unique(DEFKEYS,key) |
512 | 542 | self.set_define_comment(key,comment) |
513 | 543 | @conf |
514 | 544 | def define_cond(self,key,val,comment=''): |
515 | assert key and isinstance(key,str) | |
545 | assert isinstance(key,str) | |
546 | if not key: | |
547 | return | |
516 | 548 | if val: |
517 | 549 | self.define(key,1,comment=comment) |
518 | 550 | else: |
521 | 553 | def is_defined(self,key): |
522 | 554 | assert key and isinstance(key,str) |
523 | 555 | ban=key+'=' |
524 | for x in self.env['DEFINES']: | |
556 | for x in self.env.DEFINES: | |
525 | 557 | if x.startswith(ban): |
526 | 558 | return True |
527 | 559 | return False |
529 | 561 | def get_define(self,key): |
530 | 562 | assert key and isinstance(key,str) |
531 | 563 | ban=key+'=' |
532 | for x in self.env['DEFINES']: | |
564 | for x in self.env.DEFINES: | |
533 | 565 | if x.startswith(ban): |
534 | 566 | return x[len(ban):] |
535 | 567 | return None |
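The `define`/`undefine`/`is_defined`/`get_define` hunks above all operate on `env.DEFINES`, a flat list of `"KEY=value"` strings where a key occurs at most once. A standalone re-implementation of that list discipline, for illustration only:

```python
# Sketch of the "KEY=value" list maintained in env.DEFINES by the
# define()/get_define() hunks above (standalone, illustrative).
def set_define(defines, key, val):
    entry = '%s=%s' % (key, val)
    ban = key + '='
    for i, x in enumerate(defines):
        if x.startswith(ban):
            defines[i] = entry   # overwrite an existing definition in place
            return
    defines.append(entry)

def get_define(defines, key):
    ban = key + '='
    for x in defines:
        if x.startswith(ban):
            return x[len(ban):]  # everything after "KEY="
    return None
```

The in-place overwrite mirrors `lst[lst.index(x)]=app` in `define`, and the prefix scan mirrors `is_defined`/`get_define`.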
563 | 595 | lst.append('#include <%s>'%x) |
564 | 596 | if defines: |
565 | 597 | tbl={} |
566 | for k in self.env['DEFINES']: | |
598 | for k in self.env.DEFINES: | |
567 | 599 | a,_,b=k.partition('=') |
568 | 600 | tbl[a]=b |
569 | 601 | for k in self.env[DEFKEYS]: |
655 | 687 | Logs.debug('ccroot: dest platform: '+' '.join([conf.env[x]or'?'for x in('DEST_OS','DEST_BINFMT','DEST_CPU')])) |
656 | 688 | if icc: |
657 | 689 | ver=k['__INTEL_COMPILER'] |
658 | conf.env['CC_VERSION']=(ver[:-2],ver[-2],ver[-1]) | |
690 | conf.env.CC_VERSION=(ver[:-2],ver[-2],ver[-1]) | |
659 | 691 | else: |
660 | 692 | if isD('__clang__')and isD('__clang_major__'): |
661 | conf.env['CC_VERSION']=(k['__clang_major__'],k['__clang_minor__'],k['__clang_patchlevel__']) | |
693 | conf.env.CC_VERSION=(k['__clang_major__'],k['__clang_minor__'],k['__clang_patchlevel__']) | |
662 | 694 | else: |
663 | conf.env['CC_VERSION']=(k['__GNUC__'],k['__GNUC_MINOR__'],k.get('__GNUC_PATCHLEVEL__','0')) | |
695 | conf.env.CC_VERSION=(k['__GNUC__'],k['__GNUC_MINOR__'],k.get('__GNUC_PATCHLEVEL__','0')) | |
664 | 696 | return k |
665 | 697 | @conf |
666 | 698 | def get_xlc_version(conf,cc): |
674 | 706 | match=version_re(out or err) |
675 | 707 | if match: |
676 | 708 | k=match.groupdict() |
677 | conf.env['CC_VERSION']=(k['major'],k['minor']) | |
709 | conf.env.CC_VERSION=(k['major'],k['minor']) | |
678 | 710 | break |
679 | 711 | else: |
680 | 712 | conf.fatal('Could not determine the XLC version.') |
694 | 726 | match=version_re(version) |
695 | 727 | if match: |
696 | 728 | k=match.groupdict() |
697 | conf.env['CC_VERSION']=(k['major'],k['minor']) | |
729 | conf.env.CC_VERSION=(k['major'],k['minor']) | |
698 | 730 | else: |
699 | 731 | conf.fatal('Could not determine the suncc version.') |
700 | 732 | @conf |
702 | 734 | if self.env.DEST_BINFMT=='elf'and'gcc'in(self.env.CXX_NAME,self.env.CC_NAME): |
703 | 735 | self.env.append_unique('LINKFLAGS','-Wl,--as-needed') |
704 | 736 | class cfgtask(Task.TaskBase): |
737 | def __init__(self,*k,**kw): | |
738 | Task.TaskBase.__init__(self,*k,**kw) | |
739 | self.run_after=set() | |
705 | 740 | def display(self): |
706 | 741 | return'' |
707 | 742 | def runnable_status(self): |
743 | for x in self.run_after: | |
744 | if not x.hasrun: | |
745 | return Task.ASK_LATER | |
708 | 746 | return Task.RUN_ME |
709 | 747 | def uid(self): |
710 | 748 | return Utils.SIG_NIL |
715 | 753 | bld.init_dirs() |
716 | 754 | bld.in_msg=1 |
717 | 755 | bld.logger=self.logger |
756 | bld.multicheck_task=self | |
757 | args=self.args | |
718 | 758 | try: |
719 | bld.check(**self.args) | |
759 | if'func'in args: | |
760 | bld.test(build_fun=args['func'],msg=args.get('msg',''),okmsg=args.get('okmsg',''),errmsg=args.get('errmsg',''),) | |
761 | else: | |
762 | args['multicheck_mandatory']=args.get('mandatory',True) | |
763 | args['mandatory']=True | |
764 | try: | |
765 | bld.check(**args) | |
766 | finally: | |
767 | args['mandatory']=args['multicheck_mandatory'] | |
720 | 768 | except Exception: |
721 | 769 | return 1 |
770 | def process(self): | |
771 | Task.TaskBase.process(self) | |
772 | if'msg'in self.args: | |
773 | with self.generator.bld.multicheck_lock: | |
774 | self.conf.start_msg(self.args['msg']) | |
775 | if self.hasrun==Task.NOT_RUN: | |
776 | self.conf.end_msg('test cancelled','YELLOW') | |
777 | elif self.hasrun!=Task.SUCCESS: | |
778 | self.conf.end_msg(self.args.get('errmsg','no'),'YELLOW') | |
779 | else: | |
780 | self.conf.end_msg(self.args.get('okmsg','yes'),'GREEN') | |
722 | 781 | @conf |
723 | 782 | def multicheck(self,*k,**kw): |
724 | 783 | self.start_msg(kw.get('msg','Executing %d configuration tests'%len(k)),**kw) |
784 | for var in('DEFINES',DEFKEYS): | |
785 | self.env.append_value(var,[]) | |
786 | self.env.DEFINE_COMMENTS=self.env.DEFINE_COMMENTS or{} | |
725 | 787 | class par(object): |
726 | 788 | def __init__(self): |
727 | 789 | self.keep=False |
728 | self.returned_tasks=[] | |
729 | 790 | self.task_sigs={} |
730 | 791 | self.progress_bar=0 |
731 | 792 | def total(self): |
733 | 794 | def to_log(self,*k,**kw): |
734 | 795 | return |
735 | 796 | bld=par() |
797 | bld.keep=kw.get('run_all_tests',True) | |
736 | 798 | tasks=[] |
799 | id_to_task={} | |
737 | 800 | for dct in k: |
738 | x=cfgtask(bld=bld) | |
801 | x=Task.classes['cfgtask'](bld=bld) | |
739 | 802 | tasks.append(x) |
740 | 803 | x.args=dct |
741 | 804 | x.bld=bld |
742 | 805 | x.conf=self |
743 | 806 | x.args=dct |
744 | 807 | x.logger=Logs.make_mem_logger(str(id(x)),self.logger) |
808 | if'id'in dct: | |
809 | id_to_task[dct['id']]=x | |
810 | for x in tasks: | |
811 | for key in Utils.to_list(x.args.get('before_tests',[])): | |
812 | tsk=id_to_task[key] | |
813 | if not tsk: | |
814 | raise ValueError('No test named %r'%key) | |
815 | tsk.run_after.add(x) | |
816 | for key in Utils.to_list(x.args.get('after_tests',[])): | |
817 | tsk=id_to_task[key] | |
818 | if not tsk: | |
819 | raise ValueError('No test named %r'%key) | |
820 | x.run_after.add(tsk) | |
745 | 821 | def it(): |
746 | 822 | yield tasks |
747 | 823 | while 1: |
748 | 824 | yield[] |
749 | p=Runner.Parallel(bld,Options.options.jobs) | |
825 | bld.producer=p=Runner.Parallel(bld,Options.options.jobs) | |
826 | bld.multicheck_lock=Utils.threading.Lock() | |
750 | 827 | p.biter=it() |
828 | self.end_msg('started') | |
751 | 829 | p.start() |
752 | 830 | for x in tasks: |
753 | 831 | x.logger.memhandler.flush() |
832 | self.start_msg('-> processing test results') | |
754 | 833 | if p.error: |
755 | 834 | for x in p.error: |
756 | 835 | if getattr(x,'err_msg',None): |
757 | 836 | self.to_log(x.err_msg) |
758 | 837 | self.end_msg('fail',color='RED') |
759 | 838 | raise Errors.WafError('There is an error in the library, read config.log for more information') |
839 | failure_count=0 | |
840 | for x in tasks: | |
841 | if x.hasrun not in(Task.SUCCESS,Task.NOT_RUN): | |
842 | failure_count+=1 | |
843 | if failure_count: | |
844 | self.end_msg(kw.get('errmsg','%s test failed'%failure_count),color='YELLOW',**kw) | |
845 | else: | |
846 | self.end_msg('all ok',**kw) | |
760 | 847 | for x in tasks: |
761 | 848 | if x.hasrun!=Task.SUCCESS: |
762 | self.end_msg(kw.get('errmsg','no'),color='YELLOW',**kw) | |
763 | self.fatal(kw.get('fatalmsg',None)or'One of the tests has failed, read config.log for more information') | |
764 | self.end_msg('ok',**kw) | |
849 | if x.args.get('mandatory',True): | |
850 | self.fatal(kw.get('fatalmsg')or'One of the tests has failed, read config.log for more information') |
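The new `multicheck` code above gives each `cfgtask` a `run_after` set built from `before_tests`/`after_tests` ids, and `runnable_status` returns `ASK_LATER` until every dependency has run. A simplified sequential model of that ordering (waf actually runs the ready tasks in parallel via `Runner.Parallel`):

```python
# Sequential sketch of the before_tests/after_tests ordering added to
# multicheck above: a test runs only once its run_after set is done.
def order_tests(tests):
    """tests: dict id -> set of ids it must run after; returns a run order."""
    done, order = set(), []
    pending = dict(tests)
    while pending:
        ready = [t for t, deps in pending.items() if deps <= done]
        if not ready:
            # nothing runnable but tests remain: circular dependency
            raise ValueError('dependency cycle among tests')
        for t in sorted(ready):   # deterministic order for the sketch
            order.append(t)
            done.add(t)
            del pending[t]
    return order
```

Note waf never detects cycles this way; a cycle there would simply leave tasks in `NOT_RUN`, reported as "test cancelled" by the new `process` method.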
2 | 2 | # WARNING! Do not edit! https://waf.io/book/index.html#_obtaining_the_waf_file |
3 | 3 | |
4 | 4 | import os,shutil,platform |
5 | from waflib import Task,Utils,Errors | |
5 | from waflib import Task,Utils | |
6 | 6 | from waflib.TaskGen import taskgen_method,feature,after_method,before_method |
7 | 7 | app_info=''' |
8 | 8 | <?xml version="1.0" encoding="UTF-8"?> |
24 | 24 | ''' |
25 | 25 | @feature('c','cxx') |
26 | 26 | def set_macosx_deployment_target(self): |
27 | if self.env['MACOSX_DEPLOYMENT_TARGET']: | |
28 | os.environ['MACOSX_DEPLOYMENT_TARGET']=self.env['MACOSX_DEPLOYMENT_TARGET'] | |
27 | if self.env.MACOSX_DEPLOYMENT_TARGET: | |
28 | os.environ['MACOSX_DEPLOYMENT_TARGET']=self.env.MACOSX_DEPLOYMENT_TARGET | |
29 | 29 | elif'MACOSX_DEPLOYMENT_TARGET'not in os.environ: |
30 | 30 | if Utils.unversioned_sys_platform()=='darwin': |
31 | 31 | os.environ['MACOSX_DEPLOYMENT_TARGET']='.'.join(platform.mac_ver()[0].split('.')[:2]) |
47 | 47 | @feature('cprogram','cxxprogram') |
48 | 48 | @after_method('apply_link') |
49 | 49 | def create_task_macapp(self): |
50 | if self.env['MACAPP']or getattr(self,'mac_app',False): | |
50 | if self.env.MACAPP or getattr(self,'mac_app',False): | |
51 | 51 | out=self.link_task.outputs[0] |
52 | 52 | name=bundle_name_for_output(out) |
53 | 53 | dir=self.create_bundle_dirs(name,out) |
54 | 54 | n1=dir.find_or_declare(['Contents','MacOS',out.name]) |
55 | 55 | self.apptask=self.create_task('macapp',self.link_task.outputs,n1) |
56 | 56 | inst_to=getattr(self,'install_path','/Applications')+'/%s/Contents/MacOS/'%name |
57 | self.bld.install_files(inst_to,n1,chmod=Utils.O755) | |
57 | self.add_install_files(install_to=inst_to,install_from=n1,chmod=Utils.O755) | |
58 | 58 | if getattr(self,'mac_files',None): |
59 | 59 | mac_files_root=getattr(self,'mac_files_root',None) |
60 | 60 | if isinstance(mac_files_root,str): |
66 | 66 | for node in self.to_nodes(self.mac_files): |
67 | 67 | relpath=node.path_from(mac_files_root or node.parent) |
68 | 68 | self.create_task('macapp',node,res_dir.make_node(relpath)) |
69 | self.bld.install_as(os.path.join(inst_to,relpath),node) | |
70 | if getattr(self,'mac_resources',None): | |
71 | res_dir=n1.parent.parent.make_node('Resources') | |
72 | inst_to=getattr(self,'install_path','/Applications')+'/%s/Resources'%name | |
73 | for x in self.to_list(self.mac_resources): | |
74 | node=self.path.find_node(x) | |
75 | if not node: | |
76 | raise Errors.WafError('Missing mac_resource %r in %r'%(x,self)) | |
77 | parent=node.parent | |
78 | if os.path.isdir(node.abspath()): | |
79 | nodes=node.ant_glob('**') | |
80 | else: | |
81 | nodes=[node] | |
82 | for node in nodes: | |
83 | rel=node.path_from(parent) | |
84 | self.create_task('macapp',node,res_dir.make_node(rel)) | |
85 | self.bld.install_as(inst_to+'/%s'%rel,node) | |
69 | self.add_install_as(install_to=os.path.join(inst_to,relpath),install_source=node) | |
86 | 70 | if getattr(self.bld,'is_install',None): |
87 | 71 | self.install_task.hasrun=Task.SKIP_ME |
88 | 72 | @feature('cprogram','cxxprogram') |
89 | 73 | @after_method('apply_link') |
90 | 74 | def create_task_macplist(self): |
91 | if self.env['MACAPP']or getattr(self,'mac_app',False): | |
75 | if self.env.MACAPP or getattr(self,'mac_app',False): | |
92 | 76 | out=self.link_task.outputs[0] |
93 | 77 | name=bundle_name_for_output(out) |
94 | 78 | dir=self.create_bundle_dirs(name,out) |
107 | 91 | else: |
108 | 92 | plisttask.code=app_info |
109 | 93 | inst_to=getattr(self,'install_path','/Applications')+'/%s/Contents/'%name |
110 | self.bld.install_files(inst_to,n1) | |
94 | self.add_install_files(install_to=inst_to,install_from=n1) | |
111 | 95 | @feature('cshlib','cxxshlib') |
112 | 96 | @before_method('apply_link','propagate_uselib_vars') |
113 | 97 | def apply_bundle(self): |
114 | if self.env['MACBUNDLE']or getattr(self,'mac_bundle',False): | |
115 | self.env['LINKFLAGS_cshlib']=self.env['LINKFLAGS_cxxshlib']=[] | |
116 | self.env['cshlib_PATTERN']=self.env['cxxshlib_PATTERN']=self.env['macbundle_PATTERN'] | |
98 | if self.env.MACBUNDLE or getattr(self,'mac_bundle',False): | |
99 | self.env.LINKFLAGS_cshlib=self.env.LINKFLAGS_cxxshlib=[] | |
100 | self.env.cshlib_PATTERN=self.env.cxxshlib_PATTERN=self.env.macbundle_PATTERN | |
117 | 101 | use=self.use=self.to_list(getattr(self,'use',[])) |
118 | 102 | if not'MACBUNDLE'in use: |
119 | 103 | use.append('MACBUNDLE') |
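The `c_osx.py` hunks above install the binary, `Info.plist`, and resources into the standard macOS bundle layout. A path-only sketch of that layout, assuming the conventional `Contents/MacOS` and `Contents/Resources` structure (the waf task machinery and `bundle_name_for_output` are omitted):

```python
# Sketch of the .app install paths assembled in create_task_macapp and
# create_task_macplist above (layout only; names are illustrative).
import os

def bundle_paths(app_name, install_root='/Applications'):
    bundle = '%s.app' % app_name
    base = os.path.join(install_root, bundle, 'Contents')
    return {
        'binary': os.path.join(base, 'MacOS'),   # executable goes here
        'plist': base,                           # Info.plist in Contents/
        'resources': os.path.join(base, 'Resources'),
    }
```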
3 | 3 | |
4 | 4 | import re,string,traceback |
5 | 5 | from waflib import Logs,Utils,Errors |
6 | from waflib.Logs import debug,error | |
7 | 6 | class PreprocError(Errors.WafError): |
8 | 7 | pass |
8 | FILE_CACHE_SIZE=100000 | |
9 | LINE_CACHE_SIZE=100000 | |
9 | 10 | POPFILE='-' |
10 | 11 | recursion_limit=150 |
11 | 12 | go_absolute=False |
15 | 16 | use_trigraphs=0 |
16 | 17 | strict_quotes=0 |
17 | 18 | g_optrans={'not':'!','not_eq':'!','and':'&&','and_eq':'&=','or':'||','or_eq':'|=','xor':'^','xor_eq':'^=','bitand':'&','bitor':'|','compl':'~',} |
18 | re_lines=re.compile('^[ \t]*(#|%:)[ \t]*(ifdef|ifndef|if|else|elif|endif|include|import|define|undef|pragma)[ \t]*(.*)\r*$',re.IGNORECASE|re.MULTILINE) | |
19 | re_lines=re.compile('^[ \t]*(?:#|%:)[ \t]*(ifdef|ifndef|if|else|elif|endif|include|import|define|undef|pragma)[ \t]*(.*)\r*$',re.IGNORECASE|re.MULTILINE) | |
19 | 20 | re_mac=re.compile("^[a-zA-Z_]\w*") |
20 | 21 | re_fun=re.compile('^[a-zA-Z_][a-zA-Z0-9_]*[(]') |
21 | 22 | re_pragma_once=re.compile('^\s*once\s*',re.IGNORECASE) |
36 | 37 | undefined='u' |
37 | 38 | skipped='s' |
38 | 39 | def repl(m): |
39 | s=m.group(0) | |
40 | if s.startswith('/'): | |
40 | s=m.group() | |
41 | if s[0]=='/': | |
41 | 42 | return' ' |
42 | 43 | return s |
43 | def filter_comments(filename): | |
44 | code=Utils.readf(filename) | |
45 | if use_trigraphs: | |
46 | for(a,b)in trig_def:code=code.split(a).join(b) | |
47 | code=re_nl.sub('',code) | |
48 | code=re_cpp.sub(repl,code) | |
49 | return[(m.group(2),m.group(3))for m in re.finditer(re_lines,code)] | |
50 | 44 | prec={} |
51 | 45 | ops=['* / %','+ -','<< >>','< <= >= >','== !=','& | ^','&& ||',','] |
52 | for x in range(len(ops)): | |
53 | syms=ops[x] | |
46 | for x,syms in enumerate(ops): | |
54 | 47 | for u in syms.split(): |
55 | 48 | prec[u]=x |
56 | 49 | def trimquotes(s): |
86 | 79 | else:c=0 |
87 | 80 | return c |
88 | 81 | def get_num(lst): |
89 | if not lst:raise PreprocError("empty list for get_num") | |
82 | if not lst:raise PreprocError('empty list for get_num') | |
90 | 83 | (p,v)=lst[0] |
91 | 84 | if p==OP: |
92 | 85 | if v=='(': |
103 | 96 | count_par+=1 |
104 | 97 | i+=1 |
105 | 98 | else: |
106 | raise PreprocError("rparen expected %r"%lst) | |
99 | raise PreprocError('rparen expected %r'%lst) | |
107 | 100 | (num,_)=get_term(lst[1:i]) |
108 | 101 | return(num,lst[i+1:]) |
109 | 102 | elif v=='+': |
118 | 111 | num,lst=get_num(lst[1:]) |
119 | 112 | return(~int(num),lst) |
120 | 113 | else: |
121 | raise PreprocError("Invalid op token %r for get_num"%lst) | |
114 | raise PreprocError('Invalid op token %r for get_num'%lst) | |
122 | 115 | elif p==NUM: |
123 | 116 | return v,lst[1:] |
124 | 117 | elif p==IDENT: |
125 | 118 | return 0,lst[1:] |
126 | 119 | else: |
127 | raise PreprocError("Invalid token %r for get_num"%lst) | |
120 | raise PreprocError('Invalid token %r for get_num'%lst) | |
128 | 121 | def get_term(lst): |
129 | if not lst:raise PreprocError("empty list for get_term") | |
122 | if not lst:raise PreprocError('empty list for get_term') | |
130 | 123 | num,lst=get_num(lst) |
131 | 124 | if not lst: |
132 | 125 | return(num,[]) |
149 | 142 | break |
150 | 143 | i+=1 |
151 | 144 | else: |
152 | raise PreprocError("rparen expected %r"%lst) | |
145 | raise PreprocError('rparen expected %r'%lst) | |
153 | 146 | if int(num): |
154 | 147 | return get_term(lst[1:i]) |
155 | 148 | else: |
161 | 154 | return get_term([(NUM,num2)]+lst) |
162 | 155 | p2,v2=lst[0] |
163 | 156 | if p2!=OP: |
164 | raise PreprocError("op expected %r"%lst) | |
157 | raise PreprocError('op expected %r'%lst) | |
165 | 158 | if prec[v2]>=prec[v]: |
166 | 159 | num2=reduce_nums(num,num2,v) |
167 | 160 | return get_term([(NUM,num2)]+lst) |
169 | 162 | num3,lst=get_num(lst[1:]) |
170 | 163 | num3=reduce_nums(num2,num3,v2) |
171 | 164 | return get_term([(NUM,num),(p,v),(NUM,num3)]+lst) |
172 | raise PreprocError("cannot reduce %r"%lst) | |
165 | raise PreprocError('cannot reduce %r'%lst) | |
173 | 166 | def reduce_eval(lst): |
174 | 167 | num,lst=get_term(lst) |
175 | 168 | return(NUM,num) |
209 | 202 | else: |
210 | 203 | lst[i]=(NUM,0) |
211 | 204 | else: |
212 | raise PreprocError("Invalid define expression %r"%lst) | |
205 | raise PreprocError('Invalid define expression %r'%lst) | |
213 | 206 | elif p==IDENT and v in defs: |
214 | 207 | if isinstance(defs[v],str): |
215 | 208 | a,b=extract_macro(defs[v]) |
220 | 213 | del lst[i] |
221 | 214 | accu=to_add[:] |
222 | 215 | reduce_tokens(accu,defs,ban+[v]) |
223 | for x in range(len(accu)): | |
224 | lst.insert(i,accu[x]) | |
216 | for tmp in accu: | |
217 | lst.insert(i,tmp) | |
225 | 218 | i+=1 |
226 | 219 | else: |
227 | 220 | args=[] |
228 | 221 | del lst[i] |
229 | 222 | if i>=len(lst): |
230 | raise PreprocError("expected '(' after %r (got nothing)"%v) | |
223 | raise PreprocError('expected ( after %r (got nothing)'%v) | |
231 | 224 | (p2,v2)=lst[i] |
232 | 225 | if p2!=OP or v2!='(': |
233 | raise PreprocError("expected '(' after %r"%v) | |
226 | raise PreprocError('expected ( after %r'%v) | |
234 | 227 | del lst[i] |
235 | 228 | one_param=[] |
236 | 229 | count_paren=0 |
245 | 238 | if one_param:args.append(one_param) |
246 | 239 | break |
247 | 240 | elif v2==',': |
248 | if not one_param:raise PreprocError("empty param in funcall %s"%v) | |
241 | if not one_param:raise PreprocError('empty param in funcall %r'%v) | |
249 | 242 | args.append(one_param) |
250 | 243 | one_param=[] |
251 | 244 | else: |
313 | 306 | i+=1 |
314 | 307 | def eval_macro(lst,defs): |
315 | 308 | reduce_tokens(lst,defs,[]) |
316 | if not lst:raise PreprocError("missing tokens to evaluate") | |
309 | if not lst:raise PreprocError('missing tokens to evaluate') | |
317 | 310 | (p,v)=reduce_eval(lst) |
318 | 311 | return int(v)!=0 |
319 | 312 | def extract_macro(txt): |
321 | 314 | if re_fun.search(txt): |
322 | 315 | p,name=t[0] |
323 | 316 | p,v=t[1] |
324 | if p!=OP:raise PreprocError("expected open parenthesis") | |
317 | if p!=OP:raise PreprocError('expected (') | |
325 | 318 | i=1 |
326 | 319 | pindex=0 |
327 | 320 | params={} |
337 | 330 | elif p==OP and v==')': |
338 | 331 | break |
339 | 332 | else: |
340 | raise PreprocError("unexpected token (3)") | |
333 | raise PreprocError('unexpected token (3)') | |
341 | 334 | elif prev==IDENT: |
342 | 335 | if p==OP and v==',': |
343 | 336 | prev=v |
344 | 337 | elif p==OP and v==')': |
345 | 338 | break |
346 | 339 | else: |
347 | raise PreprocError("comma or ... expected") | |
340 | raise PreprocError('comma or ... expected') | |
348 | 341 | elif prev==',': |
349 | 342 | if p==IDENT: |
350 | 343 | params[v]=pindex |
351 | 344 | pindex+=1 |
352 | 345 | prev=p |
353 | 346 | elif p==OP and v=='...': |
354 | raise PreprocError("not implemented (1)") | |
347 | raise PreprocError('not implemented (1)') | |
355 | 348 | else: |
356 | raise PreprocError("comma or ... expected (2)") | |
349 | raise PreprocError('comma or ... expected (2)') | |
357 | 350 | elif prev=='...': |
358 | raise PreprocError("not implemented (2)") | |
351 | raise PreprocError('not implemented (2)') | |
359 | 352 | else: |
360 | raise PreprocError("unexpected else") | |
353 | raise PreprocError('unexpected else') | |
361 | 354 | return(name,[params,t[i+1:]]) |
362 | 355 | else: |
363 | 356 | (p,v)=t[0] |
365 | 358 | return(v,[[],t[1:]]) |
366 | 359 | else: |
367 | 360 | return(v,[[],[('T','')]]) |
368 | re_include=re.compile('^\s*(<(?P<a>.*)>|"(?P<b>.*)")') | |
361 | re_include=re.compile('^\s*(<(?:.*)>|"(?:.*)")') | |
369 | 362 | def extract_include(txt,defs): |
370 | 363 | m=re_include.search(txt) |
371 | 364 | if m: |
372 | if m.group('a'):return'<',m.group('a') | |
373 | if m.group('b'):return'"',m.group('b') | |
365 | txt=m.group(1) | |
366 | return txt[0],txt[1:-1] | |
374 | 367 | toks=tokenize(txt) |
375 | 368 | reduce_tokens(toks,defs,['waf_include']) |
376 | 369 | if not toks: |
377 | raise PreprocError("could not parse include %s"%txt) | |
370 | raise PreprocError('could not parse include %r'%txt) | |
378 | 371 | if len(toks)==1: |
379 | 372 | if toks[0][0]==STR: |
380 | 373 | return'"',toks[0][1] |
382 | 375 | if toks[0][1]=='<'and toks[-1][1]=='>': |
383 | 376 | ret='<',stringize(toks).lstrip('<').rstrip('>') |
384 | 377 | return ret |
385 | raise PreprocError("could not parse include %s."%txt) | |
378 | raise PreprocError('could not parse include %r'%txt) | |
386 | 379 | def parse_char(txt): |
387 | if not txt:raise PreprocError("attempted to parse a null char") | |
380 | if not txt: | |
381 | raise PreprocError('attempted to parse a null char') | |
388 | 382 | if txt[0]!='\\': |
389 | 383 | return ord(txt) |
390 | 384 | c=txt[1] |
398 | 392 | return(1+i,int(txt[1:1+i],8)) |
399 | 393 | else: |
400 | 394 | try:return chr_esc[c] |
401 | except KeyError:raise PreprocError("could not parse char literal '%s'"%txt) | |
395 | except KeyError:raise PreprocError('could not parse char literal %r'%txt) | |
402 | 396 | def tokenize(s): |
403 | 397 | return tokenize_private(s)[:] |
404 | @Utils.run_once | |
405 | 398 | def tokenize_private(s): |
406 | 399 | ret=[] |
407 | 400 | for match in re_clexer.finditer(s): |
436 | 429 | ret.append((name,v)) |
437 | 430 | break |
438 | 431 | return ret |
439 | @Utils.run_once | |
440 | def define_name(line): | |
441 | return re_mac.match(line).group(0) | |
432 | def format_defines(lst): | |
433 | ret=[] | |
434 | for y in lst: | |
435 | if y: | |
436 | pos=y.find('=') | |
437 | if pos==-1: | |
438 | ret.append(y) | |
439 | elif pos>0: | |
440 | ret.append('%s %s'%(y[:pos],y[pos+1:])) | |
441 | else: | |
442 | raise ValueError('Invalid define expression %r'%y) | |
443 | return ret | |
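The new `format_defines` helper above replaces the old inline `trimquotes`/`split('=')` logic in `start()`: it turns `NAME=value` macro strings into the `NAME value` form that the preprocessor's `define` lines expect, passes bare macros through, and rejects strings that start with `=`. A self-contained copy with example input:

```python
def format_defines(lst):
    """Convert 'NAME=value' macro strings to 'NAME value' define
    lines, as in the helper added by the diff above."""
    ret = []
    for y in lst:
        if not y:
            continue                    # skip empty entries
        pos = y.find('=')
        if pos == -1:
            ret.append(y)               # bare macro, e.g. -DDEBUG
        elif pos > 0:
            ret.append('%s %s' % (y[:pos], y[pos + 1:]))
        else:
            raise ValueError('Invalid define expression %r' % y)
    return ret

print(format_defines(['DEBUG', 'VERSION=3', 'NAME="x"']))
```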
442 | 444 | class c_parser(object): |
443 | 445 | def __init__(self,nodepaths=None,defines=None): |
444 | 446 | self.lines=[] |
453 | 455 | self.nodes=[] |
454 | 456 | self.names=[] |
455 | 457 | self.curfile='' |
456 | self.ban_includes=set([]) | |
458 | self.ban_includes=set() | |
457 | 459 | def cached_find_resource(self,node,filename): |
458 | 460 | try: |
459 | nd=node.ctx.cache_nd | |
461 | cache=node.ctx.preproc_cache_node | |
460 | 462 | except AttributeError: |
461 | nd=node.ctx.cache_nd={} | |
462 | tup=(node,filename) | |
463 | global FILE_CACHE_SIZE | |
464 | cache=node.ctx.preproc_cache_node=Utils.lru_cache(FILE_CACHE_SIZE) | |
465 | key=(node,filename) | |
463 | 466 | try: |
464 | return nd[tup] | |
467 | return cache[key] | |
465 | 468 | except KeyError: |
466 | 469 | ret=node.find_resource(filename) |
467 | 470 | if ret: |
471 | 474 | tmp=node.ctx.srcnode.search_node(ret.path_from(node.ctx.bldnode)) |
472 | 475 | if tmp and getattr(tmp,'children',None): |
473 | 476 | ret=None |
474 | nd[tup]=ret | |
477 | cache[key]=ret | |
475 | 478 | return ret |
476 | 479 | def tryfind(self,filename): |
477 | 480 | if filename.endswith('.moc'): |
490 | 493 | if not filename in self.names: |
491 | 494 | self.names.append(filename) |
492 | 495 | return found |
496 | def filter_comments(self,node): | |
497 | code=node.read() | |
498 | if use_trigraphs: | |
499 | for(a,b)in trig_def:code=code.split(a).join(b) | |
500 | code=re_nl.sub('',code) | |
501 | code=re_cpp.sub(repl,code) | |
502 | return re_lines.findall(code) | |
503 | def parse_lines(self,node): | |
504 | try: | |
505 | cache=node.ctx.preproc_cache_lines | |
506 | except AttributeError: | |
507 | global LINE_CACHE_SIZE | |
508 | cache=node.ctx.preproc_cache_lines=Utils.lru_cache(LINE_CACHE_SIZE) | |
509 | try: | |
510 | return cache[node] | |
511 | except KeyError: | |
512 | cache[node]=lines=self.filter_comments(node) | |
513 | lines.append((POPFILE,'')) | |
514 | lines.reverse() | |
515 | return lines | |
493 | 516 | def addlines(self,node): |
494 | 517 | self.currentnode_stack.append(node.parent) |
495 | filepath=node.abspath() | |
496 | 518 | self.count_files+=1 |
497 | 519 | if self.count_files>recursion_limit: |
498 | raise PreprocError("recursion limit exceeded") | |
499 | pc=self.parse_cache | |
500 | debug('preproc: reading file %r',filepath) | |
520 | raise PreprocError('recursion limit exceeded') | |
521 | if Logs.verbose: | |
522 | Logs.debug('preproc: reading file %r',node) | |
501 | 523 | try: |
502 | lns=pc[filepath] | |
503 | except KeyError: | |
504 | pass | |
505 | else: | |
506 | self.lines.extend(lns) | |
507 | return | |
508 | try: | |
509 | lines=filter_comments(filepath) | |
510 | lines.append((POPFILE,'')) | |
511 | lines.reverse() | |
512 | pc[filepath]=lines | |
513 | self.lines.extend(lines) | |
514 | except IOError: | |
515 | raise PreprocError("could not read the file %s"%filepath) | |
524 | lines=self.parse_lines(node) | |
525 | except EnvironmentError: | |
526 | raise PreprocError('could not read the file %r'%node) | |
516 | 527 | except Exception: |
517 | 528 | if Logs.verbose>0: |
518 | error("parsing %s failed"%filepath) | |
529 | Logs.error('parsing %r failed',node) | |
519 | 530 | traceback.print_exc() |
531 | else: | |
532 | self.lines.extend(lines) | |
520 | 533 | def start(self,node,env): |
521 | debug('preproc: scanning %s (in %s)',node.name,node.parent.name) | |
522 | bld=node.ctx | |
523 | try: | |
524 | self.parse_cache=bld.parse_cache | |
525 | except AttributeError: | |
526 | self.parse_cache=bld.parse_cache={} | |
534 | Logs.debug('preproc: scanning %s (in %s)',node.name,node.parent.name) | |
527 | 535 | self.current_file=node |
528 | 536 | self.addlines(node) |
529 | if env['DEFINES']: | |
530 | try: | |
531 | lst=['%s %s'%(x[0],trimquotes('='.join(x[1:])))for x in[y.split('=')for y in env['DEFINES']]] | |
532 | lst.reverse() | |
533 | self.lines.extend([('define',x)for x in lst]) | |
534 | except AttributeError: | |
535 | pass | |
537 | if env.DEFINES: | |
538 | lst=format_defines(env.DEFINES) | |
539 | lst.reverse() | |
540 | self.lines.extend([('define',x)for x in lst]) | |
536 | 541 | while self.lines: |
537 | 542 | (token,line)=self.lines.pop() |
538 | 543 | if token==POPFILE: |
541 | 546 | continue |
542 | 547 | try: |
543 | 548 | ve=Logs.verbose |
544 | if ve:debug('preproc: line is %s - %s state is %s',token,line,self.state) | |
549 | if ve:Logs.debug('preproc: line is %s - %s state is %s',token,line,self.state) | |
545 | 550 | state=self.state |
546 | 551 | if token[:2]=='if': |
547 | 552 | state.append(undefined) |
556 | 561 | else:state[-1]=ignored |
557 | 562 | elif token=='ifdef': |
558 | 563 | m=re_mac.match(line) |
559 | if m and m.group(0)in self.defs:state[-1]=accepted | |
564 | if m and m.group()in self.defs:state[-1]=accepted | |
560 | 565 | else:state[-1]=ignored |
561 | 566 | elif token=='ifndef': |
562 | 567 | m=re_mac.match(line) |
563 | if m and m.group(0)in self.defs:state[-1]=ignored | |
568 | if m and m.group()in self.defs:state[-1]=ignored | |
564 | 569 | else:state[-1]=accepted |
565 | 570 | elif token=='include'or token=='import': |
566 | 571 | (kind,inc)=extract_include(line,self.defs) |
567 | if ve:debug('preproc: include found %s (%s) ',inc,kind) | |
572 | if ve:Logs.debug('preproc: include found %s (%s) ',inc,kind) | |
568 | 573 | if kind=='"'or not strict_quotes: |
569 | 574 | self.current_file=self.tryfind(inc) |
570 | 575 | if token=='import': |
580 | 585 | elif state[-1]==ignored:state[-1]=accepted |
581 | 586 | elif token=='define': |
582 | 587 | try: |
583 | self.defs[define_name(line)]=line | |
584 | except Exception: | |
585 | raise PreprocError("Invalid define line %s"%line) | |
588 | self.defs[self.define_name(line)]=line | |
589 | except AttributeError: | |
590 | raise PreprocError('Invalid define line %r'%line) | |
586 | 591 | elif token=='undef': |
587 | 592 | m=re_mac.match(line) |
588 | if m and m.group(0)in self.defs: | |
589 | self.defs.__delitem__(m.group(0)) | |
593 | if m and m.group()in self.defs: | |
594 | self.defs.__delitem__(m.group()) | |
590 | 595 | elif token=='pragma': |
591 | 596 | if re_pragma_once.match(line.lower()): |
592 | 597 | self.ban_includes.add(self.current_file) |
593 | 598 | except Exception as e: |
594 | 599 | if Logs.verbose: |
595 | debug('preproc: line parsing failed (%s): %s %s',e,line,Utils.ex_stack()) | |
600 | Logs.debug('preproc: line parsing failed (%s): %s %s',e,line,Utils.ex_stack()) | |
601 | def define_name(self,line): | |
602 | return re_mac.match(line).group() | |
596 | 603 | def scan(task): |
597 | 604 | global go_absolute |
598 | 605 | try: |
605 | 612 | nodepaths=[x for x in incn if x.is_child_of(x.ctx.srcnode)or x.is_child_of(x.ctx.bldnode)] |
606 | 613 | tmp=c_parser(nodepaths) |
607 | 614 | tmp.start(task.inputs[0],task.env) |
608 | if Logs.verbose: | |
609 | debug('deps: deps for %r: %r; unresolved %r'%(task.inputs,tmp.nodes,tmp.names)) | |
610 | 615 | return(tmp.nodes,tmp.names) |
46 | 46 | mode='c' |
47 | 47 | if self.env.CXX: |
48 | 48 | mode='cxx' |
49 | self.check(compile_filename=[],features='link_lib_test',msg='Checking for libraries',mode=mode,test_exec=test_exec,) | |
49 | self.check(compile_filename=[],features='link_lib_test',msg='Checking for libraries',mode=mode,test_exec=test_exec) | |
50 | 50 | INLINE_CODE=''' |
51 | 51 | typedef int foo_t; |
52 | 52 | static %s foo_t static_foo () {return 0; } |
147 | 147 | tmp=[] |
148 | 148 | def check_msg(self): |
149 | 149 | return tmp[0] |
150 | self.check(fragment=ENDIAN_FRAGMENT,features='c grep_for_endianness',msg="Checking for endianness",define='ENDIANNESS',tmp=tmp,okmsg=check_msg) | |
150 | self.check(fragment=ENDIAN_FRAGMENT,features='c grep_for_endianness',msg='Checking for endianness',define='ENDIANNESS',tmp=tmp,okmsg=check_msg) | |
151 | 151 | return tmp[0] |
2 | 2 | # WARNING! Do not edit! https://waf.io/book/index.html#_obtaining_the_waf_file |
3 | 3 | |
4 | 4 | import os,re |
5 | from waflib import Task,Utils,Node,Errors | |
5 | from waflib import Task,Utils,Node,Errors,Logs | |
6 | 6 | from waflib.TaskGen import after_method,before_method,feature,taskgen_method,extension |
7 | 7 | from waflib.Tools import c_aliases,c_preproc,c_config,c_osx,c_tests |
8 | 8 | from waflib.Configure import conf |
31 | 31 | @taskgen_method |
32 | 32 | def to_incnodes(self,inlst): |
33 | 33 | lst=[] |
34 | seen=set([]) | |
34 | seen=set() | |
35 | 35 | for x in self.to_list(inlst): |
36 | 36 | if x in seen or not x: |
37 | 37 | continue |
56 | 56 | @feature('c','cxx','d','asm','fc','includes') |
57 | 57 | @after_method('propagate_uselib_vars','process_source') |
58 | 58 | def apply_incpaths(self): |
59 | lst=self.to_incnodes(self.to_list(getattr(self,'includes',[]))+self.env['INCLUDES']) | |
59 | lst=self.to_incnodes(self.to_list(getattr(self,'includes',[]))+self.env.INCLUDES) | |
60 | 60 | self.includes_nodes=lst |
61 | self.env['INCPATHS']=[x.abspath()for x in lst] | |
61 | cwd=self.get_cwd() | |
62 | self.env.INCPATHS=[x.path_from(cwd)for x in lst] | |
62 | 63 | class link_task(Task.Task): |
63 | 64 | color='YELLOW' |
64 | 65 | inst_to=None |
65 | 66 | chmod=Utils.O755 |
66 | 67 | def add_target(self,target): |
67 | 68 | if isinstance(target,str): |
69 | base=self.generator.path | |
70 | if target.startswith('#'): | |
71 | target=target[1:] | |
72 | base=self.generator.bld.bldnode | |
68 | 73 | pattern=self.env[self.__class__.__name__+'_PATTERN'] |
69 | 74 | if not pattern: |
70 | 75 | pattern='%s' |
81 | 86 | tmp=folder+os.sep+pattern%name |
82 | 87 | else: |
83 | 88 | tmp=pattern%name |
84 | target=self.generator.path.find_or_declare(tmp) | |
89 | target=base.find_or_declare(tmp) | |
85 | 90 | self.set_outputs(target) |
91 | def exec_command(self,*k,**kw): | |
92 | ret=super(link_task,self).exec_command(*k,**kw) | |
93 | if not ret and self.env.DO_MANIFEST: | |
94 | ret=self.exec_mf() | |
95 | return ret | |
96 | def exec_mf(self): | |
97 | if not self.env.MT: | |
98 | return 0 | |
99 | manifest=None | |
100 | for out_node in self.outputs: | |
101 | if out_node.name.endswith('.manifest'): | |
102 | manifest=out_node.abspath() | |
103 | break | |
104 | else: | |
105 | return 0 | |
106 | mode='' | |
107 | for x in Utils.to_list(self.generator.features): | |
108 | if x in('cprogram','cxxprogram','fcprogram','fcprogram_test'): | |
109 | mode=1 | |
110 | elif x in('cshlib','cxxshlib','fcshlib'): | |
111 | mode=2 | |
112 | Logs.debug('msvc: embedding manifest in mode %r',mode) | |
113 | lst=[]+self.env.MT | |
114 | lst.extend(Utils.to_list(self.env.MTFLAGS)) | |
115 | lst.extend(['-manifest',manifest]) | |
116 | lst.append('-outputresource:%s;%s'%(self.outputs[0].abspath(),mode)) | |
117 | return super(link_task,self).exec_command(lst) | |
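The `exec_mf` method added above embeds a Windows manifest after linking by invoking the manifest tool from `env.MT`, with mode 1 for programs and 2 for shared libraries. A sketch of just the command assembly (tool path and flags here are illustrative stand-ins for `env.MT`/`env.MTFLAGS`):

```python
def mt_command(mt, mtflags, manifest, output, mode):
    """Build the mt invocation as exec_mf above does; arguments are
    illustrative, e.g. mt=['mt.exe'], mode=1 (program) or 2 (shlib)."""
    lst = [] + mt                       # copy the tool command
    lst.extend(mtflags)                 # extra flags, e.g. ['/nologo']
    lst.extend(['-manifest', manifest])
    lst.append('-outputresource:%s;%s' % (output, mode))
    return lst

print(mt_command(['mt.exe'], ['/nologo'], 'app.exe.manifest', 'app.exe', 1))
```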
86 | 118 | class stlink_task(link_task): |
87 | 119 | run_str='${AR} ${ARFLAGS} ${AR_TGT_F}${TGT} ${AR_SRC_F}${SRC}' |
88 | 120 | chmod=Utils.O644 |
116 | 148 | except AttributeError: |
117 | 149 | inst_to=self.link_task.__class__.inst_to |
118 | 150 | if inst_to: |
119 | self.install_task=self.bld.install_files(inst_to,self.link_task.outputs[:],env=self.env,chmod=self.link_task.chmod,task=self.link_task) | |
151 | self.install_task=self.add_install_files(install_to=inst_to,install_from=self.link_task.outputs[:],chmod=self.link_task.chmod,task=self.link_task) | |
120 | 152 | @taskgen_method |
121 | 153 | def use_rec(self,name,**kw): |
122 | 154 | if name in self.tmp_use_not or name in self.tmp_use_seen: |
155 | 187 | @before_method('apply_incpaths','propagate_uselib_vars') |
156 | 188 | @after_method('apply_link','process_source') |
157 | 189 | def process_use(self): |
158 | use_not=self.tmp_use_not=set([]) | |
190 | use_not=self.tmp_use_not=set() | |
159 | 191 | self.tmp_use_seen=[] |
160 | 192 | use_prec=self.tmp_use_prec={} |
161 | 193 | self.uselib=self.to_list(getattr(self,'uselib',[])) |
166 | 198 | for x in use_not: |
167 | 199 | if x in use_prec: |
168 | 200 | del use_prec[x] |
169 | out=[] | |
201 | out=self.tmp_use_sorted=[] | |
170 | 202 | tmp=[] |
171 | 203 | for x in self.tmp_use_seen: |
172 | 204 | for k in use_prec.values(): |
200 | 232 | if var=='LIB'or y.tmp_use_stlib or x in names: |
201 | 233 | self.env.append_value(var,[y.target[y.target.rfind(os.sep)+1:]]) |
202 | 234 | self.link_task.dep_nodes.extend(y.link_task.outputs) |
203 | tmp_path=y.link_task.outputs[0].parent.path_from(self.bld.bldnode) | |
235 | tmp_path=y.link_task.outputs[0].parent.path_from(self.get_cwd()) | |
204 | 236 | self.env.append_unique(var+'PATH',[tmp_path]) |
205 | 237 | else: |
206 | 238 | if y.tmp_use_objects: |
235 | 267 | link_task.inputs.append(x) |
236 | 268 | @taskgen_method |
237 | 269 | def get_uselib_vars(self): |
238 | _vars=set([]) | |
270 | _vars=set() | |
239 | 271 | for x in self.features: |
240 | 272 | if x in USELIB_VARS: |
241 | 273 | _vars|=USELIB_VARS[x] |
266 | 298 | name=self.target.name |
267 | 299 | else: |
268 | 300 | name=os.path.split(self.target)[1] |
269 | implib=self.env['implib_PATTERN']%name | |
301 | implib=self.env.implib_PATTERN%name | |
270 | 302 | implib=dll.parent.find_or_declare(implib) |
271 | self.env.append_value('LINKFLAGS',self.env['IMPLIB_ST']%implib.bldpath()) | |
303 | self.env.append_value('LINKFLAGS',self.env.IMPLIB_ST%implib.bldpath()) | |
272 | 304 | self.link_task.outputs.append(implib) |
273 | 305 | if getattr(self,'defs',None)and self.env.DEST_BINFMT=='pe': |
274 | 306 | node=self.path.find_resource(self.defs) |
275 | 307 | if not node: |
276 | 308 | raise Errors.WafError('invalid def file %r'%self.defs) |
277 | 309 | if'msvc'in(self.env.CC_NAME,self.env.CXX_NAME): |
278 | self.env.append_value('LINKFLAGS','/def:%s'%node.path_from(self.bld.bldnode)) | |
310 | self.env.append_value('LINKFLAGS','/def:%s'%node.path_from(self.get_cwd())) | |
279 | 311 | self.link_task.dep_nodes.append(node) |
280 | 312 | else: |
281 | 313 | self.link_task.inputs.append(node) |
287 | 319 | inst_to=self.install_path |
288 | 320 | except AttributeError: |
289 | 321 | inst_to='${IMPLIBDIR}' |
290 | self.install_task.dest='${BINDIR}' | |
322 | self.install_task.install_to='${BINDIR}' | |
291 | 323 | if not self.env.IMPLIBDIR: |
292 | 324 | self.env.IMPLIBDIR=self.env.LIBDIR |
293 | self.implib_install_task=self.bld.install_files(inst_to,implib,env=self.env,chmod=self.link_task.chmod,task=self.link_task) | |
325 | self.implib_install_task=self.add_install_files(install_to=inst_to,install_from=implib,chmod=self.link_task.chmod,task=self.link_task) | |
294 | 326 | re_vnum=re.compile('^([1-9]\\d*|0)([.]([1-9]\\d*|0)){0,2}?$') |
295 | 327 | @feature('cshlib','cxxshlib','dshlib','fcshlib','vnum') |
296 | 328 | @after_method('apply_link','propagate_uselib_vars') |
317 | 349 | v=self.env.SONAME_ST%name2 |
318 | 350 | self.env.append_value('LINKFLAGS',v.split()) |
319 | 351 | if self.env.DEST_OS!='openbsd': |
320 | outs=[node.parent.find_or_declare(name3)] | |
352 | outs=[node.parent.make_node(name3)] | |
321 | 353 | if name2!=name3: |
322 | outs.append(node.parent.find_or_declare(name2)) | |
354 | outs.append(node.parent.make_node(name2)) | |
323 | 355 | self.create_task('vnum',node,outs) |
324 | 356 | if getattr(self,'install_task',None): |
325 | 357 | self.install_task.hasrun=Task.SKIP_ME |
326 | bld=self.bld | |
327 | path=self.install_task.dest | |
358 | path=self.install_task.install_to | |
328 | 359 | if self.env.DEST_OS=='openbsd': |
329 | 360 | libname=self.link_task.outputs[0].name |
330 | t1=bld.install_as('%s%s%s'%(path,os.sep,libname),node,env=self.env,chmod=self.link_task.chmod) | |
361 | t1=self.add_install_as(install_to='%s/%s'%(path,libname),install_from=node,chmod=self.link_task.chmod) | |
331 | 362 | self.vnum_install_task=(t1,) |
332 | 363 | else: |
333 | t1=bld.install_as(path+os.sep+name3,node,env=self.env,chmod=self.link_task.chmod) | |
334 | t3=bld.symlink_as(path+os.sep+libname,name3) | |
364 | t1=self.add_install_as(install_to=path+os.sep+name3,install_from=node,chmod=self.link_task.chmod) | |
365 | t3=self.add_symlink_as(install_to=path+os.sep+libname,install_from=name3) | |
335 | 366 | if name2!=name3: |
336 | t2=bld.symlink_as(path+os.sep+name2,name3) | |
367 | t2=self.add_symlink_as(install_to=path+os.sep+name2,install_from=name3) | |
337 | 368 | self.vnum_install_task=(t1,t2,t3) |
338 | 369 | else: |
339 | 370 | self.vnum_install_task=(t1,t3) |
340 | if'-dynamiclib'in self.env['LINKFLAGS']: | |
371 | if'-dynamiclib'in self.env.LINKFLAGS: | |
341 | 372 | try: |
342 | 373 | inst_to=self.install_path |
343 | 374 | except AttributeError: |
350 | 381 | self.env.append_value('LINKFLAGS','-Wl,-current_version,%s'%self.vnum) |
351 | 382 | class vnum(Task.Task): |
352 | 383 | color='CYAN' |
353 | quient=True | |
354 | 384 | ext_in=['.bin'] |
355 | 385 | def keyword(self): |
356 | 386 | return'Symlinking' |
370 | 400 | for t in self.run_after: |
371 | 401 | if not t.hasrun: |
372 | 402 | return Task.ASK_LATER |
373 | for x in self.outputs: | |
374 | x.sig=Utils.h_file(x.abspath()) | |
375 | 403 | return Task.SKIP_ME |
376 | 404 | class fake_stlib(stlink_task): |
377 | 405 | def runnable_status(self): |
378 | 406 | for t in self.run_after: |
379 | 407 | if not t.hasrun: |
380 | 408 | return Task.ASK_LATER |
381 | for x in self.outputs: | |
382 | x.sig=Utils.h_file(x.abspath()) | |
383 | 409 | return Task.SKIP_ME |
384 | 410 | @conf |
385 | 411 | def read_shlib(self,name,paths=[],export_includes=[],export_defines=[]): |
400 | 426 | for y in names: |
401 | 427 | node=x.find_node(y) |
402 | 428 | if node: |
403 | node.sig=Utils.h_file(node.abspath()) | |
429 | try: | |
430 | Utils.h_file(node.abspath()) | |
431 | except EnvironmentError: | |
432 | raise ValueError('Could not read %r'%y) | |
404 | 433 | break |
405 | 434 | else: |
406 | 435 | continue |
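The `vnum` hunks above install a versioned library plus symlinks (real file `libfoo.so.1.2.3`, soname link `libfoo.so.1`, plain link `libfoo.so`), validating the version against `re_vnum`. A sketch of the naming scheme, using the regex exactly as it appears in the diff; the helper name is invented for illustration:

```python
import re

# Version pattern from the diff: up to three dot-separated numbers
re_vnum = re.compile('^([1-9]\\d*|0)([.]([1-9]\\d*|0)){0,2}?$')

def vnum_names(libname, vnum):
    """Derive the three names the vnum feature deals with, e.g.
    libfoo.so -> (libfoo.so, libfoo.so.1, libfoo.so.1.2.3)."""
    if not re_vnum.match(vnum):
        raise ValueError('invalid version %r' % vnum)
    name3 = libname + '.' + vnum                 # actual binary
    name2 = libname + '.' + vnum.split('.')[0]   # soname symlink
    return libname, name2, name3

print(vnum_names('libfoo.so', '1.2.3'))
```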
5 | 5 | from waflib.Tools import ccroot |
6 | 6 | from waflib import Utils |
7 | 7 | from waflib.Logs import debug |
8 | c_compiler={'win32':['msvc','gcc','clang'],'cygwin':['gcc'],'darwin':['clang','gcc'],'aix':['xlc','gcc','clang'],'linux':['gcc','clang','icc'],'sunos':['suncc','gcc'],'irix':['gcc','irixcc'],'hpux':['gcc'],'osf1V':['gcc'],'gnu':['gcc','clang'],'java':['gcc','msvc','clang','icc'],'default':['gcc','clang'],} | |
8 | c_compiler={'win32':['msvc','gcc','clang'],'cygwin':['gcc'],'darwin':['clang','gcc'],'aix':['xlc','gcc','clang'],'linux':['gcc','clang','icc'],'sunos':['suncc','gcc'],'irix':['gcc','irixcc'],'hpux':['gcc'],'osf1V':['gcc'],'gnu':['gcc','clang'],'java':['gcc','msvc','clang','icc'],'default':['clang','gcc'],} | |
9 | 9 | def default_compilers(): |
10 | 10 | build_platform=Utils.unversioned_sys_platform() |
11 | 11 | possible_compiler_list=c_compiler.get(build_platform,c_compiler['default']) |
12 | 12 | return' '.join(possible_compiler_list) |
13 | 13 | def configure(conf): |
14 | try:test_for_compiler=conf.options.check_c_compiler or default_compilers() | |
15 | except AttributeError:conf.fatal("Add options(opt): opt.load('compiler_c')") | |
14 | try: | |
15 | test_for_compiler=conf.options.check_c_compiler or default_compilers() | |
16 | except AttributeError: | |
17 | conf.fatal("Add options(opt): opt.load('compiler_c')") | |
16 | 18 | for compiler in re.split('[ ,]+',test_for_compiler): |
17 | 19 | conf.env.stash() |
18 | 20 | conf.start_msg('Checking for %r (C compiler)'%compiler) |
21 | 23 | except conf.errors.ConfigurationError as e: |
22 | 24 | conf.env.revert() |
23 | 25 | conf.end_msg(False) |
24 | debug('compiler_c: %r'%e) | |
26 | debug('compiler_c: %r',e) | |
25 | 27 | else: |
26 | if conf.env['CC']: | |
28 | if conf.env.CC: | |
27 | 29 | conf.end_msg(conf.env.get_flat('CC')) |
28 | conf.env['COMPILER_CC']=compiler | |
30 | conf.env.COMPILER_CC=compiler | |
31 | conf.env.commit() | |
29 | 32 | break |
33 | conf.env.revert() | |
30 | 34 | conf.end_msg(False) |
31 | 35 | else: |
32 | 36 | conf.fatal('could not configure a C compiler!') |
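The rewritten `configure()` above (and its C++, D, and Fortran siblings below) probes each candidate compiler with a stash/try/commit-or-revert cycle, and the diff adds an extra `conf.env.revert()` for the case where the tool loads but finds no compiler. A generic sketch of that pattern; `Env` and `probe` are illustrative stand-ins for waf's `ConfigSet` and `conf.load(compiler)`:

```python
class Env(dict):
    """Toy ConfigSet with the stash/revert/commit protocol."""
    def stash(self):
        self._saved = dict(self)        # snapshot current settings
    def revert(self):
        self.clear()
        self.update(self._saved)        # roll back to the snapshot
    def commit(self):
        del self._saved                 # accept the changes

def pick_first_working(env, candidates, probe):
    for name in candidates:
        env.stash()
        try:
            probe(env, name)            # may raise on failure
        except Exception:
            env.revert()                # discard partial settings
            continue
        if env.get('CC'):
            env['COMPILER_CC'] = name
            env.commit()
            return name
        env.revert()                    # loaded, but no compiler found
    raise RuntimeError('could not configure a C compiler!')
```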
5 | 5 | from waflib.Tools import ccroot |
6 | 6 | from waflib import Utils |
7 | 7 | from waflib.Logs import debug |
8 | cxx_compiler={'win32':['msvc','g++','clang++'],'cygwin':['g++'],'darwin':['clang++','g++'],'aix':['xlc++','g++','clang++'],'linux':['g++','clang++','icpc'],'sunos':['sunc++','g++'],'irix':['g++'],'hpux':['g++'],'osf1V':['g++'],'gnu':['g++','clang++'],'java':['g++','msvc','clang++','icpc'],'default':['g++','clang++']} | |
8 | cxx_compiler={'win32':['msvc','g++','clang++'],'cygwin':['g++'],'darwin':['clang++','g++'],'aix':['xlc++','g++','clang++'],'linux':['g++','clang++','icpc'],'sunos':['sunc++','g++'],'irix':['g++'],'hpux':['g++'],'osf1V':['g++'],'gnu':['g++','clang++'],'java':['g++','msvc','clang++','icpc'],'default':['clang++','g++']} | |
9 | 9 | def default_compilers(): |
10 | 10 | build_platform=Utils.unversioned_sys_platform() |
11 | 11 | possible_compiler_list=cxx_compiler.get(build_platform,cxx_compiler['default']) |
12 | 12 | return' '.join(possible_compiler_list) |
13 | 13 | def configure(conf): |
14 | try:test_for_compiler=conf.options.check_cxx_compiler or default_compilers() | |
15 | except AttributeError:conf.fatal("Add options(opt): opt.load('compiler_cxx')") | |
14 | try: | |
15 | test_for_compiler=conf.options.check_cxx_compiler or default_compilers() | |
16 | except AttributeError: | |
17 | conf.fatal("Add options(opt): opt.load('compiler_cxx')") | |
16 | 18 | for compiler in re.split('[ ,]+',test_for_compiler): |
17 | 19 | conf.env.stash() |
18 | 20 | conf.start_msg('Checking for %r (C++ compiler)'%compiler) |
21 | 23 | except conf.errors.ConfigurationError as e: |
22 | 24 | conf.env.revert() |
23 | 25 | conf.end_msg(False) |
24 | debug('compiler_cxx: %r'%e) | |
26 | debug('compiler_cxx: %r',e) | |
25 | 27 | else: |
26 | if conf.env['CXX']: | |
28 | if conf.env.CXX: | |
27 | 29 | conf.end_msg(conf.env.get_flat('CXX')) |
28 | conf.env['COMPILER_CXX']=compiler | |
30 | conf.env.COMPILER_CXX=compiler | |
31 | conf.env.commit() | |
29 | 32 | break |
33 | conf.env.revert() | |
30 | 34 | conf.end_msg(False) |
31 | 35 | else: |
32 | 36 | conf.fatal('could not configure a C++ compiler!') |
9 | 9 | possible_compiler_list=d_compiler.get(build_platform,d_compiler['default']) |
10 | 10 | return' '.join(possible_compiler_list) |
11 | 11 | def configure(conf): |
12 | try:test_for_compiler=conf.options.check_d_compiler or default_compilers() | |
13 | except AttributeError:conf.fatal("Add options(opt): opt.load('compiler_d')") | |
12 | try: | |
13 | test_for_compiler=conf.options.check_d_compiler or default_compilers() | |
14 | except AttributeError: | |
15 | conf.fatal("Add options(opt): opt.load('compiler_d')") | |
14 | 16 | for compiler in re.split('[ ,]+',test_for_compiler): |
15 | 17 | conf.env.stash() |
16 | 18 | conf.start_msg('Checking for %r (D compiler)'%compiler) |
19 | 21 | except conf.errors.ConfigurationError as e: |
20 | 22 | conf.env.revert() |
21 | 23 | conf.end_msg(False) |
22 | Logs.debug('compiler_d: %r'%e) | |
24 | Logs.debug('compiler_d: %r',e) | |
23 | 25 | else: |
24 | 26 | if conf.env.D: |
25 | 27 | conf.end_msg(conf.env.get_flat('D')) |
26 | conf.env['COMPILER_D']=compiler | |
28 | conf.env.COMPILER_D=compiler | |
29 | conf.env.commit() | |
27 | 30 | break |
31 | conf.env.revert() | |
28 | 32 | conf.end_msg(False) |
29 | 33 | else: |
30 | 34 | conf.fatal('could not configure a D compiler!') |
10 | 10 | possible_compiler_list=fc_compiler.get(build_platform,fc_compiler['default']) |
11 | 11 | return' '.join(possible_compiler_list) |
12 | 12 | def configure(conf): |
13 | try:test_for_compiler=conf.options.check_fortran_compiler or default_compilers() | |
14 | except AttributeError:conf.fatal("Add options(opt): opt.load('compiler_fc')") | |
13 | try: | |
14 | test_for_compiler=conf.options.check_fortran_compiler or default_compilers() | |
15 | except AttributeError: | |
16 | conf.fatal("Add options(opt): opt.load('compiler_fc')") | |
15 | 17 | for compiler in re.split('[ ,]+',test_for_compiler): |
16 | 18 | conf.env.stash() |
17 | 19 | conf.start_msg('Checking for %r (Fortran compiler)'%compiler) |
20 | 22 | except conf.errors.ConfigurationError as e: |
21 | 23 | conf.env.revert() |
22 | 24 | conf.end_msg(False) |
23 | Logs.debug('compiler_fortran: %r'%e) | |
25 | Logs.debug('compiler_fortran: %r',e) | |
24 | 26 | else: |
25 | if conf.env['FC']: | |
27 | if conf.env.FC: | |
26 | 28 | conf.end_msg(conf.env.get_flat('FC')) |
27 | 29 | conf.env.COMPILER_FORTRAN=compiler |
30 | conf.env.commit() | |
28 | 31 | break |
32 | conf.env.revert() | |
29 | 33 | conf.end_msg(False) |
30 | 34 | else: |
31 | 35 | conf.fatal('could not configure a Fortran compiler!') |
5 | 5 | from waflib.TaskGen import before_method,after_method,feature |
6 | 6 | from waflib.Tools import ccroot |
7 | 7 | from waflib.Configure import conf |
8 | import os,tempfile | |
9 | 8 | ccroot.USELIB_VARS['cs']=set(['CSFLAGS','ASSEMBLIES','RESOURCES']) |
10 | 9 | ccroot.lib_patterns['csshlib']=['%s'] |
11 | 10 | @feature('cs') |
27 | 26 | inst_to=getattr(self,'install_path',bintype=='exe'and'${BINDIR}'or'${LIBDIR}') |
28 | 27 | if inst_to: |
29 | 28 | mod=getattr(self,'chmod',bintype=='exe'and Utils.O755 or Utils.O644) |
30 | self.install_task=self.bld.install_files(inst_to,self.cs_task.outputs[:],env=self.env,chmod=mod) | |
29 | self.install_task=self.add_install_files(install_to=inst_to,install_from=self.cs_task.outputs[:],chmod=mod) | |
31 | 30 | @feature('cs') |
32 | 31 | @after_method('apply_cs') |
33 | 32 | def use_cs(self): |
73 | 72 | color='YELLOW' |
74 | 73 | run_str='${MCS} ${CSTYPE} ${CSFLAGS} ${ASS_ST:ASSEMBLIES} ${RES_ST:RESOURCES} ${OUT} ${SRC}' |
75 | 74 | def exec_command(self,cmd,**kw): |
76 | bld=self.generator.bld | |
77 | try: | |
78 | if not kw.get('cwd',None): | |
79 | kw['cwd']=bld.cwd | |
80 | except AttributeError: | |
81 | bld.cwd=kw['cwd']=bld.variant_dir | |
82 | try: | |
83 | tmp=None | |
84 | if isinstance(cmd,list)and len(' '.join(cmd))>=8192: | |
85 | program=cmd[0] | |
86 | cmd=[self.quote_response_command(x)for x in cmd] | |
87 | (fd,tmp)=tempfile.mkstemp() | |
88 | os.write(fd,'\r\n'.join(i.replace('\\','\\\\')for i in cmd[1:]).encode()) | |
89 | os.close(fd) | |
90 | cmd=[program,'@'+tmp] | |
91 | ret=self.generator.bld.exec_command(cmd,**kw) | |
92 | finally: | |
93 | if tmp: | |
94 | try: | |
95 | os.remove(tmp) | |
96 | except OSError: | |
97 | pass | |
98 | return ret | |
99 | def quote_response_command(self,flag): | |
100 | if flag.lower()=='/noconfig': | |
101 | return'' | |
102 | if flag.find(' ')>-1: | |
103 | for x in('/r:','/reference:','/resource:','/lib:','/out:'): | |
104 | if flag.startswith(x): | |
105 | flag='%s"%s"'%(x,'","'.join(flag[len(x):].split(','))) | |
106 | break | |
107 | else: | |
108 | flag='"%s"'%flag | |
109 | return flag | |
75 | if'/noconfig'in cmd: | |
76 | raise ValueError('/noconfig is not allowed when using response files, check your flags!') | |
77 | return super(self.__class__,self).exec_command(cmd,**kw) | |
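The hunk above removes the C# task's hand-rolled response-file handling in favor of waf's generic long-command-line support, now only rejecting `/noconfig` (which cannot appear inside a response file). For reference, a sketch of the technique the deleted code implemented: when a command line would exceed the platform limit (about 8 kB on Windows), the arguments are written to a temporary file and replaced by `@file`:

```python
import os
import tempfile

def maybe_use_response_file(cmd, limit=8192):
    """Sketch of the removed workaround: spill long argument lists
    into an @response-file. The caller must delete the temp file."""
    if not (isinstance(cmd, list) and len(' '.join(cmd)) >= limit):
        return cmd, None                # short enough, run as-is
    fd, tmp = tempfile.mkstemp()
    # one argument per line, backslashes doubled as in the old code
    os.write(fd, '\r\n'.join(i.replace('\\', '\\\\') for i in cmd[1:]).encode())
    os.close(fd)
    return [cmd[0], '@' + tmp], tmp
```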
110 | 78 | def configure(conf): |
111 | 79 | csc=getattr(Options.options,'cscbinary',None) |
112 | 80 | if csc: |
123 | 91 | color='YELLOW' |
124 | 92 | inst_to=None |
125 | 93 | def runnable_status(self): |
126 | for x in self.outputs: | |
127 | x.sig=Utils.h_file(x.abspath()) | |
128 | 94 | return Task.SKIP_ME |
129 | 95 | @conf |
130 | 96 | def read_csshlib(self,name,paths=[]): |
10 | 10 | if not'.c'in TaskGen.task_gen.mappings: |
11 | 11 | TaskGen.task_gen.mappings['.c']=TaskGen.task_gen.mappings['.cpp'] |
12 | 12 | class cxx(Task.Task): |
13 | run_str='${CXX} ${ARCH_ST:ARCH} ${CXXFLAGS} ${CPPFLAGS} ${FRAMEWORKPATH_ST:FRAMEWORKPATH} ${CPPPATH_ST:INCPATHS} ${DEFINES_ST:DEFINES} ${CXX_SRC_F}${SRC} ${CXX_TGT_F}${TGT[0].abspath()}' | |
13 | run_str='${CXX} ${ARCH_ST:ARCH} ${CXXFLAGS} ${FRAMEWORKPATH_ST:FRAMEWORKPATH} ${CPPPATH_ST:INCPATHS} ${DEFINES_ST:DEFINES} ${CXX_SRC_F}${SRC} ${CXX_TGT_F}${TGT[0].abspath()} ${CPPFLAGS}' | |
14 | 14 | vars=['CXXDEPS'] |
15 | 15 | ext_in=['.h'] |
16 | 16 | scan=c_preproc.scan |
34 | 34 | return task |
35 | 35 | if getattr(self,'generate_headers',None): |
36 | 36 | tsk=create_compiled_task(self,'d_with_header',node) |
37 | tsk.outputs.append(node.change_ext(self.env['DHEADER_ext'])) | |
37 | tsk.outputs.append(node.change_ext(self.env.DHEADER_ext)) | |
38 | 38 | else: |
39 | 39 | tsk=create_compiled_task(self,'d',node) |
40 | 40 | return tsk |
10 | 10 | v.DEST_OS=Utils.unversioned_sys_platform() |
11 | 11 | binfmt=Utils.destos_to_binfmt(self.env.DEST_OS) |
12 | 12 | if binfmt=='pe': |
13 | v['dprogram_PATTERN']='%s.exe' | |
14 | v['dshlib_PATTERN']='lib%s.dll' | |
15 | v['dstlib_PATTERN']='lib%s.a' | |
13 | v.dprogram_PATTERN='%s.exe' | |
14 | v.dshlib_PATTERN='lib%s.dll' | |
15 | v.dstlib_PATTERN='lib%s.a' | |
16 | 16 | elif binfmt=='mac-o': |
17 | v['dprogram_PATTERN']='%s' | |
18 | v['dshlib_PATTERN']='lib%s.dylib' | |
19 | v['dstlib_PATTERN']='lib%s.a' | |
17 | v.dprogram_PATTERN='%s' | |
18 | v.dshlib_PATTERN='lib%s.dylib' | |
19 | v.dstlib_PATTERN='lib%s.a' | |
20 | 20 | else: |
21 | v['dprogram_PATTERN']='%s' | |
22 | v['dshlib_PATTERN']='lib%s.so' | |
23 | v['dstlib_PATTERN']='lib%s.a' | |
21 | v.dprogram_PATTERN='%s' | |
22 | v.dshlib_PATTERN='lib%s.so' | |
23 | v.dstlib_PATTERN='lib%s.a' | |
24 | 24 | DLIB=''' |
25 | 25 | version(D_Version2) { |
26 | 26 | import std.stdio; |
2 | 2 | # WARNING! Do not edit! https://waf.io/book/index.html#_obtaining_the_waf_file |
3 | 3 | |
4 | 4 | import re |
5 | from waflib import Utils,Logs | |
5 | from waflib import Utils | |
6 | 6 | def filter_comments(filename): |
7 | 7 | txt=Utils.readf(filename) |
8 | 8 | i=0 |
127 | 127 | gruik.start(node) |
128 | 128 | nodes=gruik.nodes |
129 | 129 | names=gruik.names |
130 | if Logs.verbose: | |
131 | Logs.debug('deps: deps for %s: %r; unresolved %r'%(str(node),nodes,names)) | |
132 | 130 | return(nodes,names) |
15 | 15 | @conf |
16 | 16 | def common_flags_ldc(conf): |
17 | 17 | v=conf.env |
18 | v['DFLAGS']=['-d-version=Posix'] | |
19 | v['LINKFLAGS']=[] | |
20 | v['DFLAGS_dshlib']=['-relocation-model=pic'] | |
18 | v.DFLAGS=['-d-version=Posix'] | |
19 | v.LINKFLAGS=[] | |
20 | v.DFLAGS_dshlib=['-relocation-model=pic'] | |
21 | 21 | @conf |
22 | 22 | def common_flags_dmd(conf): |
23 | 23 | v=conf.env |
24 | v['D_SRC_F']=['-c'] | |
25 | v['D_TGT_F']='-of%s' | |
26 | v['D_LINKER']=v['D'] | |
27 | v['DLNK_SRC_F']='' | |
28 | v['DLNK_TGT_F']='-of%s' | |
29 | v['DINC_ST']='-I%s' | |
30 | v['DSHLIB_MARKER']=v['DSTLIB_MARKER']='' | |
31 | v['DSTLIB_ST']=v['DSHLIB_ST']='-L-l%s' | |
32 | v['DSTLIBPATH_ST']=v['DLIBPATH_ST']='-L-L%s' | |
33 | v['LINKFLAGS_dprogram']=['-quiet'] | |
34 | v['DFLAGS_dshlib']=['-fPIC'] | |
35 | v['LINKFLAGS_dshlib']=['-L-shared'] | |
36 | v['DHEADER_ext']='.di' | |
24 | v.D_SRC_F=['-c'] | |
25 | v.D_TGT_F='-of%s' | |
26 | v.D_LINKER=v.D | |
27 | v.DLNK_SRC_F='' | |
28 | v.DLNK_TGT_F='-of%s' | |
29 | v.DINC_ST='-I%s' | |
30 | v.DSHLIB_MARKER=v.DSTLIB_MARKER='' | |
31 | v.DSTLIB_ST=v.DSHLIB_ST='-L-l%s' | |
32 | v.DSTLIBPATH_ST=v.DLIBPATH_ST='-L-L%s' | |
33 | v.LINKFLAGS_dprogram=['-quiet'] | |
34 | v.DFLAGS_dshlib=['-fPIC'] | |
35 | v.LINKFLAGS_dshlib=['-L-shared'] | |
36 | v.DHEADER_ext='.di' | |
37 | 37 | v.DFLAGS_d_with_header=['-H','-Hf'] |
38 | v['D_HDR_F']='%s' | |
38 | v.D_HDR_F='%s' | |
39 | 39 | def configure(conf): |
40 | 40 | conf.find_dmd() |
41 | 41 | if sys.platform=='win32': |
42 | 42 | out=conf.cmd_and_log(conf.env.D+['--help']) |
43 | if out.find("D Compiler v2.")>-1: | |
43 | if out.find('D Compiler v2.')>-1: | |
44 | 44 | conf.fatal('dmd2 on Windows is not supported, use gdc or ldc2 instead') |
45 | 45 | conf.load('ar') |
46 | 46 | conf.load('d') |
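The bulk of this diff migrates waf's env access from `v['KEY']` item syntax to `v.KEY` attribute syntax. That rewrite only works because waf's ConfigSet routes attribute access through item access, with undefined keys reading as empty rather than raising. A minimal sketch of that behaviour (`MiniConfigSet` is illustrative, not waf's actual class):

```python
class MiniConfigSet(object):
    """Toy stand-in for waflib.ConfigSet.ConfigSet: attribute access is
    routed through item access, and undefined keys read as empty."""
    def __init__(self):
        object.__setattr__(self, 'table', {})

    def __getitem__(self, key):
        return self.table.get(key, [])

    def __setitem__(self, key, value):
        self.table[key] = value

    def __getattr__(self, name):
        # only called when normal attribute lookup fails
        return self[name]

    def __setattr__(self, name, value):
        self[name] = value

v = MiniConfigSet()
v.DFLAGS = ['-d-version=Posix']   # same effect as v['DFLAGS'] = [...]
```

Both spellings read and write the same underlying table, which is why the diff can convert call sites file by file without behaviour changes.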
5 | 5 | meths_typos=['__call__','program','shlib','stlib','objects'] |
6 | 6 | import sys |
7 | 7 | from waflib import Logs,Build,Node,Task,TaskGen,ConfigSet,Errors,Utils |
8 | import waflib.Tools.ccroot | |
8 | from waflib.Tools import ccroot | |
9 | 9 | def check_same_targets(self): |
10 | 10 | mp=Utils.defaultdict(list) |
11 | 11 | uids={} |
12 | 12 | def check_task(tsk): |
13 | 13 | if not isinstance(tsk,Task.Task): |
14 | return | |
15 | if hasattr(tsk,'no_errcheck_out'): | |
14 | 16 | return |
15 | 17 | for node in tsk.outputs: |
16 | 18 | mp[node].append(tsk) |
33 | 35 | Logs.error(msg) |
34 | 36 | for x in v: |
35 | 37 | if Logs.verbose>1: |
36 | Logs.error(' %d. %r'%(1+v.index(x),x.generator)) | |
38 | Logs.error(' %d. %r',1+v.index(x),x.generator) | |
37 | 39 | else: |
38 | Logs.error(' %d. %r in %r'%(1+v.index(x),x.generator.name,getattr(x.generator,'path',None))) | |
40 | Logs.error(' %d. %r in %r',1+v.index(x),x.generator.name,getattr(x.generator,'path',None)) | |
41 | Logs.error('If you think that this is an error, set no_errcheck_out on the task instance') | |
39 | 42 | if not dupe: |
40 | 43 | for(k,v)in uids.items(): |
41 | 44 | if len(v)>1: |
42 | 45 | Logs.error('* Several tasks use the same identifier. Please check the information on\n https://waf.io/apidocs/Task.html?highlight=uid#waflib.Task.Task.uid') |
43 | 46 | for tsk in v: |
44 | Logs.error(' - object %r (%r) defined in %r'%(tsk.__class__.__name__,tsk,tsk.generator)) | |
47 | Logs.error(' - object %r (%r) defined in %r',tsk.__class__.__name__,tsk,tsk.generator) | |
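Another recurring change in these hunks: eager `Logs.error('…%r' % x)` becomes deferred `Logs.error('…%r', x)`, and `if Logs.verbose:` guards around debug calls are dropped elsewhere in the diff. With logging-style deferred formatting, the `%`-substitution happens only when a record is actually emitted, so filtered-out messages cost almost nothing and the guard becomes unnecessary. The stdlib `logging` module shows the same idea (this sketch uses `logging` directly, not waf's `Logs` wrapper):

```python
import io
import logging

# capture emitted records in a buffer so the effect is observable
buf = io.StringIO()
log = logging.getLogger('errcheck-demo')
log.addHandler(logging.StreamHandler(buf))
log.setLevel(logging.ERROR)

# deferred: arguments are substituted only when the record is emitted
log.error('several tasks build %r', 'hello.o')

# filtered out at DEBUG level: the format string is never rendered
log.debug('deps for %r: %r', 'main.f90', ['mod_a'])

output = buf.getvalue()
```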
45 | 48 | def check_invalid_constraints(self): |
46 | feat=set([]) | |
49 | feat=set() | |
47 | 50 | for x in list(TaskGen.feats.values()): |
48 | 51 | feat.union(set(x)) |
49 | 52 | for(x,y)in TaskGen.task_gen.prec.items(): |
50 | 53 | feat.add(x) |
51 | 54 | feat.union(set(y)) |
52 | ext=set([]) | |
55 | ext=set() | |
53 | 56 | for x in TaskGen.task_gen.mappings.values(): |
54 | 57 | ext.add(x.__name__) |
55 | 58 | invalid=ext&feat |
56 | 59 | if invalid: |
57 | Logs.error('The methods %r have invalid annotations: @extension <-> @feature/@before_method/@after_method'%list(invalid)) | |
60 | Logs.error('The methods %r have invalid annotations: @extension <-> @feature/@before_method/@after_method',list(invalid)) | |
58 | 61 | for cls in list(Task.classes.values()): |
59 | 62 | if sys.hexversion>0x3000000 and issubclass(cls,Task.Task)and isinstance(cls.hcode,str): |
60 | 63 | raise Errors.WafError('Class %r has hcode value %r of type <str>, expecting <bytes> (use Utils.h_cmd() ?)'%(cls,cls.hcode)) |
61 | 64 | for x in('before','after'): |
62 | 65 | for y in Utils.to_list(getattr(cls,x,[])): |
63 | if not Task.classes.get(y,None): | |
64 | Logs.error('Erroneous order constraint %r=%r on task class %r'%(x,y,cls.__name__)) | |
66 | if not Task.classes.get(y): | |
67 | Logs.error('Erroneous order constraint %r=%r on task class %r',x,y,cls.__name__) | |
65 | 68 | if getattr(cls,'rule',None): |
66 | Logs.error('Erroneous attribute "rule" on task class %r (rename to "run_str")'%cls.__name__) | |
69 | Logs.error('Erroneous attribute "rule" on task class %r (rename to "run_str")',cls.__name__) | |
67 | 70 | def replace(m): |
68 | 71 | oldcall=getattr(Build.BuildContext,m) |
69 | 72 | def call(self,*k,**kw): |
72 | 75 | if x in kw: |
73 | 76 | if x=='iscopy'and'subst'in getattr(self,'features',''): |
74 | 77 | continue |
75 | Logs.error('Fix the typo %r -> %r on %r'%(x,typos[x],ret)) | |
78 | Logs.error('Fix the typo %r -> %r on %r',x,typos[x],ret) | |
76 | 79 | return ret |
77 | 80 | setattr(Build.BuildContext,m,call) |
78 | 81 | def enhance_lib(): |
82 | 85 | if k: |
83 | 86 | lst=Utils.to_list(k[0]) |
84 | 87 | for pat in lst: |
85 | if'..'in pat.split('/'): | |
86 | Logs.error("In ant_glob pattern %r: '..' means 'two dots', not 'parent directory'"%k[0]) | |
88 | sp=pat.split('/') | |
89 | if'..'in sp: | |
90 | Logs.error("In ant_glob pattern %r: '..' means 'two dots', not 'parent directory'",k[0]) | |
91 | if'.'in sp: | |
92 | Logs.error("In ant_glob pattern %r: '.' means 'one dot', not 'current directory'",k[0]) | |
87 | 93 | if kw.get('remove',True): |
88 | 94 | try: |
89 | 95 | if self.is_child_of(self.ctx.bldnode)and not kw.get('quiet',False): |
90 | Logs.error('Using ant_glob on the build folder (%r) is dangerous (quiet=True to disable this warning)'%self) | |
96 | Logs.error('Using ant_glob on the build folder (%r) is dangerous (quiet=True to disable this warning)',self) | |
91 | 97 | except AttributeError: |
92 | 98 | pass |
93 | 99 | return self.old_ant_glob(*k,**kw) |
97 | 103 | def is_before(t1,t2): |
98 | 104 | ret=old(t1,t2) |
99 | 105 | if ret and old(t2,t1): |
100 | Logs.error('Contradictory order constraints in classes %r %r'%(t1,t2)) | |
106 | Logs.error('Contradictory order constraints in classes %r %r',t1,t2) | |
101 | 107 | return ret |
102 | 108 | Task.is_before=is_before |
103 | 109 | def check_err_features(self): |
106 | 112 | Logs.error('feature shlib -> cshlib, dshlib or cxxshlib') |
107 | 113 | for x in('c','cxx','d','fc'): |
108 | 114 | if not x in lst and lst and lst[0]in[x+y for y in('program','shlib','stlib')]: |
109 | Logs.error('%r features is probably missing %r'%(self,x)) | |
115 | Logs.error('%r features is probably missing %r',self,x) | |
110 | 116 | TaskGen.feature('*')(check_err_features) |
111 | 117 | def check_err_order(self): |
112 | 118 | if not hasattr(self,'rule')and not'subst'in Utils.to_list(self.features): |
113 | 119 | for x in('before','after','ext_in','ext_out'): |
114 | 120 | if hasattr(self,x): |
115 | Logs.warn('Erroneous order constraint %r on non-rule based task generator %r'%(x,self)) | |
121 | Logs.warn('Erroneous order constraint %r on non-rule based task generator %r',x,self) | |
116 | 122 | else: |
117 | 123 | for x in('before','after'): |
118 | 124 | for y in self.to_list(getattr(self,x,[])): |
119 | 125 | if not Task.classes.get(y,None): |
120 | Logs.error('Erroneous order constraint %s=%r on %r (no such class)'%(x,y,self)) | |
126 | Logs.error('Erroneous order constraint %s=%r on %r (no such class)',x,y,self) | |
121 | 127 | TaskGen.feature('*')(check_err_order) |
122 | 128 | def check_compile(self): |
123 | 129 | check_invalid_constraints(self) |
146 | 152 | self.orig_use_rec(name,**kw) |
147 | 153 | TaskGen.task_gen.orig_use_rec=TaskGen.task_gen.use_rec |
148 | 154 | TaskGen.task_gen.use_rec=use_rec |
149 | def getattri(self,name,default=None): | |
155 | def _getattr(self,name,default=None): | |
150 | 156 | if name=='append'or name=='add': |
151 | 157 | raise Errors.WafError('env.append and env.add do not exist: use env.append_value/env.append_unique') |
152 | 158 | elif name=='prepend': |
155 | 161 | return object.__getattr__(self,name,default) |
156 | 162 | else: |
157 | 163 | return self[name] |
158 | ConfigSet.ConfigSet.__getattr__=getattri | |
164 | ConfigSet.ConfigSet.__getattr__=_getattr | |
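The renamed `_getattr` hook (previously `getattri`) installed on ConfigSet above doubles as a typo guard: common mistakes such as `env.append` raise a pointed error instead of silently reading an empty value. A condensed sketch of the pattern (`GuardedEnv` is illustrative, and it raises `AttributeError` where waf raises `Errors.WafError`):

```python
class GuardedEnv(dict):
    """dict-backed env whose attribute access catches common typos."""
    def __getattr__(self, name):
        # __getattr__ fires only when normal attribute lookup fails,
        # so real dict methods like get() are unaffected
        if name in ('append', 'add'):
            raise AttributeError(
                'env.append and env.add do not exist: '
                'use env.append_value/env.append_unique')
        if name == 'prepend':
            raise AttributeError(
                'env.prepend does not exist: use env.prepend_value')
        # undefined variables read as empty, like waf's ConfigSet
        return self.get(name, [])

env = GuardedEnv()
env['CFLAGS'] = ['-O2']
```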
159 | 165 | def options(opt): |
160 | 166 | enhance_lib() |
161 | def configure(conf): | |
162 | pass | 
1 | 1 | # encoding: utf-8 |
2 | 2 | # WARNING! Do not edit! https://waf.io/book/index.html#_obtaining_the_waf_file |
3 | 3 | |
4 | from waflib import Utils,Task,Logs | |
4 | from waflib import Utils,Task | |
5 | 5 | from waflib.Tools import ccroot,fc_config,fc_scan |
6 | from waflib.TaskGen import feature,extension | |
6 | from waflib.TaskGen import extension | |
7 | 7 | from waflib.Configure import conf |
8 | ccroot.USELIB_VARS['fc']=set(['FCFLAGS','DEFINES','INCLUDES']) | |
8 | ccroot.USELIB_VARS['fc']=set(['FCFLAGS','DEFINES','INCLUDES','FCPPFLAGS']) | |
9 | 9 | ccroot.USELIB_VARS['fcprogram_test']=ccroot.USELIB_VARS['fcprogram']=set(['LIB','STLIB','LIBPATH','STLIBPATH','LINKFLAGS','RPATH','LINKDEPS']) |
10 | 10 | ccroot.USELIB_VARS['fcshlib']=set(['LIB','STLIB','LIBPATH','STLIBPATH','LINKFLAGS','RPATH','LINKDEPS']) |
11 | 11 | ccroot.USELIB_VARS['fcstlib']=set(['ARFLAGS','LINKDEPS']) |
12 | @feature('fcprogram','fcshlib','fcstlib','fcprogram_test') | |
13 | def dummy(self): | |
14 | pass | |
15 | 12 | @extension('.f','.f90','.F','.F90','.for','.FOR') |
16 | 13 | def fc_hook(self,node): |
17 | 14 | return self.create_compiled_task('fc',node) |
24 | 21 | return[x for x in tasks if isinstance(x,fc)and not getattr(x,'nomod',None)and not getattr(x,'mod_fortran_done',None)] |
25 | 22 | class fc(Task.Task): |
26 | 23 | color='GREEN' |
27 | run_str='${FC} ${FCFLAGS} ${FCINCPATH_ST:INCPATHS} ${FCDEFINES_ST:DEFINES} ${_FCMODOUTFLAGS} ${FC_TGT_F}${TGT[0].abspath()} ${FC_SRC_F}${SRC[0].abspath()}' | |
24 | run_str='${FC} ${FCFLAGS} ${FCINCPATH_ST:INCPATHS} ${FCDEFINES_ST:DEFINES} ${_FCMODOUTFLAGS} ${FC_TGT_F}${TGT[0].abspath()} ${FC_SRC_F}${SRC[0].abspath()} ${FCPPFLAGS}' | |
28 | 25 | vars=["FORTRANMODPATHFLAG"] |
29 | 26 | def scan(self): |
30 | 27 | tmp=fc_scan.fortran_parser(self.generator.includes_nodes) |
31 | 28 | tmp.task=self |
32 | 29 | tmp.start(self.inputs[0]) |
33 | if Logs.verbose: | |
34 | Logs.debug('deps: deps for %r: %r; unresolved %r'%(self.inputs,tmp.nodes,tmp.names)) | |
35 | 30 | return(tmp.nodes,tmp.names) |
36 | 31 | def runnable_status(self): |
37 | 32 | if getattr(self,'mod_fortran_done',None): |
54 | 49 | if x.startswith('MOD@'): |
55 | 50 | name=bld.modfile(x.replace('MOD@','')) |
56 | 51 | node=bld.srcnode.find_or_declare(name) |
57 | if not getattr(node,'sig',None): | |
58 | node.sig=Utils.SIG_NIL | |
59 | 52 | tsk.set_outputs(node) |
60 | 53 | outs[id(node)].add(tsk) |
61 | 54 | for tsk in lst: |
88 | 81 | inst_to='${BINDIR}' |
89 | 82 | class fcshlib(fcprogram): |
90 | 83 | inst_to='${LIBDIR}' |
84 | class fcstlib(ccroot.stlink_task): | |
85 | pass | |
91 | 86 | class fcprogram_test(fcprogram): |
92 | 87 | def runnable_status(self): |
93 | 88 | ret=super(fcprogram_test,self).runnable_status() |
98 | 93 | bld=self.generator.bld |
99 | 94 | kw['shell']=isinstance(cmd,str) |
100 | 95 | kw['stdout']=kw['stderr']=Utils.subprocess.PIPE |
101 | kw['cwd']=bld.variant_dir | |
96 | kw['cwd']=self.get_cwd() | |
102 | 97 | bld.out=bld.err='' |
103 | 98 | bld.to_log('command: %s\n'%cmd) |
104 | 99 | kw['output']=0 |
107 | 102 | except Exception: |
108 | 103 | return-1 |
109 | 104 | if bld.out: |
110 | bld.to_log("out: %s\n"%bld.out) | |
105 | bld.to_log('out: %s\n'%bld.out) | |
111 | 106 | if bld.err: |
112 | bld.to_log("err: %s\n"%bld.err) | |
113 | class fcstlib(ccroot.stlink_task): | |
114 | pass | |
107 | bld.to_log('err: %s\n'%bld.err) | 

9 | 9 | @conf |
10 | 10 | def fc_flags(conf): |
11 | 11 | v=conf.env |
12 | v['FC_SRC_F']=[] | |
13 | v['FC_TGT_F']=['-c','-o'] | |
14 | v['FCINCPATH_ST']='-I%s' | |
15 | v['FCDEFINES_ST']='-D%s' | |
16 | if not v['LINK_FC']:v['LINK_FC']=v['FC'] | |
17 | v['FCLNK_SRC_F']=[] | |
18 | v['FCLNK_TGT_F']=['-o'] | |
19 | v['FCFLAGS_fcshlib']=['-fpic'] | |
20 | v['LINKFLAGS_fcshlib']=['-shared'] | |
21 | v['fcshlib_PATTERN']='lib%s.so' | |
22 | v['fcstlib_PATTERN']='lib%s.a' | |
23 | v['FCLIB_ST']='-l%s' | |
24 | v['FCLIBPATH_ST']='-L%s' | |
25 | v['FCSTLIB_ST']='-l%s' | |
26 | v['FCSTLIBPATH_ST']='-L%s' | |
27 | v['FCSTLIB_MARKER']='-Wl,-Bstatic' | |
28 | v['FCSHLIB_MARKER']='-Wl,-Bdynamic' | |
29 | v['SONAME_ST']='-Wl,-h,%s' | |
12 | v.FC_SRC_F=[] | |
13 | v.FC_TGT_F=['-c','-o'] | |
14 | v.FCINCPATH_ST='-I%s' | |
15 | v.FCDEFINES_ST='-D%s' | |
16 | if not v.LINK_FC: | |
17 | v.LINK_FC=v.FC | |
18 | v.FCLNK_SRC_F=[] | |
19 | v.FCLNK_TGT_F=['-o'] | |
20 | v.FCFLAGS_fcshlib=['-fpic'] | |
21 | v.LINKFLAGS_fcshlib=['-shared'] | |
22 | v.fcshlib_PATTERN='lib%s.so' | |
23 | v.fcstlib_PATTERN='lib%s.a' | |
24 | v.FCLIB_ST='-l%s' | |
25 | v.FCLIBPATH_ST='-L%s' | |
26 | v.FCSTLIB_ST='-l%s' | |
27 | v.FCSTLIBPATH_ST='-L%s' | |
28 | v.FCSTLIB_MARKER='-Wl,-Bstatic' | |
29 | v.FCSHLIB_MARKER='-Wl,-Bdynamic' | |
30 | v.SONAME_ST='-Wl,-h,%s' | |
30 | 31 | @conf |
31 | 32 | def fc_add_flags(conf): |
33 | conf.add_os_flags('FCPPFLAGS',dup=False) | |
32 | 34 | conf.add_os_flags('FCFLAGS',dup=False) |
33 | 35 | conf.add_os_flags('LINKFLAGS',dup=False) |
34 | 36 | conf.add_os_flags('LDFLAGS',dup=False) |
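`fc_add_flags` now also pulls `FCPPFLAGS` from the environment, matching the new `FCPPFLAGS` uselib variable and its spot at the end of the `fc` run_str. `conf.add_os_flags(var, dup=False)` reads the named process environment variable and appends its flags to the matching env slot, skipping values already present. Roughly (this `add_os_flags` is a hand-rolled stand-in; real waf splits with its own `Utils.to_list`, not `shlex`):

```python
import shlex

def add_os_flags(env, var, environ, dup=False):
    """Seed env[var] from an environment-variable string; with
    dup=False, flags that are already present are skipped."""
    current = env.setdefault(var, [])
    for flag in shlex.split(environ.get(var, '')):
        if dup or flag not in current:
            current.append(flag)

env = {}
add_os_flags(env, 'FCPPFLAGS',
             environ={'FCPPFLAGS': '-DWITH_MPI -I/opt/include -DWITH_MPI'})
```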
50 | 52 | @conf |
51 | 53 | def fortran_modifier_darwin(conf): |
52 | 54 | v=conf.env |
53 | v['FCFLAGS_fcshlib']=['-fPIC'] | |
54 | v['LINKFLAGS_fcshlib']=['-dynamiclib'] | |
55 | v['fcshlib_PATTERN']='lib%s.dylib' | |
56 | v['FRAMEWORKPATH_ST']='-F%s' | |
57 | v['FRAMEWORK_ST']='-framework %s' | |
58 | v['LINKFLAGS_fcstlib']=[] | |
59 | v['FCSHLIB_MARKER']='' | |
60 | v['FCSTLIB_MARKER']='' | |
61 | v['SONAME_ST']='' | |
55 | v.FCFLAGS_fcshlib=['-fPIC'] | |
56 | v.LINKFLAGS_fcshlib=['-dynamiclib'] | |
57 | v.fcshlib_PATTERN='lib%s.dylib' | |
58 | v.FRAMEWORKPATH_ST='-F%s' | |
59 | v.FRAMEWORK_ST='-framework %s' | |
60 | v.LINKFLAGS_fcstlib=[] | |
61 | v.FCSHLIB_MARKER='' | |
62 | v.FCSTLIB_MARKER='' | |
63 | v.SONAME_ST='' | |
62 | 64 | @conf |
63 | 65 | def fortran_modifier_win32(conf): |
64 | 66 | v=conf.env |
65 | v['fcprogram_PATTERN']=v['fcprogram_test_PATTERN']='%s.exe' | |
66 | v['fcshlib_PATTERN']='%s.dll' | |
67 | v['implib_PATTERN']='lib%s.dll.a' | |
68 | v['IMPLIB_ST']='-Wl,--out-implib,%s' | |
69 | v['FCFLAGS_fcshlib']=[] | |
70 | v.append_value('FCFLAGS_fcshlib',['-DDLL_EXPORT']) | |
67 | v.fcprogram_PATTERN=v.fcprogram_test_PATTERN='%s.exe' | |
68 | v.fcshlib_PATTERN='%s.dll' | |
69 | v.implib_PATTERN='lib%s.dll.a' | |
70 | v.IMPLIB_ST='-Wl,--out-implib,%s' | |
71 | v.FCFLAGS_fcshlib=[] | |
71 | 72 | v.append_value('LINKFLAGS',['-Wl,--enable-auto-import']) |
72 | 73 | @conf |
73 | 74 | def fortran_modifier_cygwin(conf): |
74 | 75 | fortran_modifier_win32(conf) |
75 | 76 | v=conf.env |
76 | v['fcshlib_PATTERN']='cyg%s.dll' | |
77 | v.fcshlib_PATTERN='cyg%s.dll' | |
77 | 78 | v.append_value('LINKFLAGS_fcshlib',['-Wl,--enable-auto-image-base']) |
78 | v['FCFLAGS_fcshlib']=[] | |
79 | v.FCFLAGS_fcshlib=[] | |
79 | 80 | @conf |
80 | 81 | def check_fortran_dummy_main(self,*k,**kw): |
81 | 82 | if not self.env.CC: |
257 | 258 | self.start_msg('Getting fortran mangling scheme') |
258 | 259 | for(u,du,c)in mangling_schemes(): |
259 | 260 | try: |
260 | self.check_cc(compile_filename=[],features='link_main_routines_func',msg='nomsg',errmsg='nomsg',mandatory=True,dummy_func_nounder=mangle_name(u,du,c,"foobar"),dummy_func_under=mangle_name(u,du,c,"foo_bar"),main_func_name=self.env.FC_MAIN) | |
261 | self.check_cc(compile_filename=[],features='link_main_routines_func',msg='nomsg',errmsg='nomsg',dummy_func_nounder=mangle_name(u,du,c,'foobar'),dummy_func_under=mangle_name(u,du,c,'foo_bar'),main_func_name=self.env.FC_MAIN) | |
261 | 262 | except self.errors.ConfigurationError: |
262 | 263 | pass |
263 | 264 | else: |
271 | 272 | @feature('pyext') |
272 | 273 | @before_method('propagate_uselib_vars','apply_link') |
273 | 274 | def set_lib_pat(self): |
274 | self.env['fcshlib_PATTERN']=self.env['pyext_PATTERN'] | |
275 | self.env.fcshlib_PATTERN=self.env.pyext_PATTERN | |
275 | 276 | @conf |
276 | 277 | def detect_openmp(self): |
277 | 278 | for x in('-fopenmp','-openmp','-mp','-xopenmp','-omp','-qsmp=omp'): |
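`detect_openmp` probes a list of candidate compiler flags in order and keeps the first one the compiler accepts, since every toolchain spells OpenMP support differently. The skeleton of that configure idiom, with a stand-in `compiles` predicate in place of an actual compile check:

```python
def first_accepted_flag(candidates, compiles):
    """Return the first flag the probe accepts, else raise."""
    for flag in candidates:
        if compiles(flag):
            return flag
    raise RuntimeError('no candidate flag was accepted')

OPENMP_FLAGS = ('-fopenmp', '-openmp', '-mp', '-xopenmp', '-omp', '-qsmp=omp')

# pretend our toolchain only understands the PGI-style '-mp' flag
chosen = first_accepted_flag(OPENMP_FLAGS, lambda f: f == '-mp')
```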
1 | 1 | # encoding: utf-8 |
2 | 2 | # WARNING! Do not edit! https://waf.io/book/index.html#_obtaining_the_waf_file |
3 | 3 | |
4 | import waflib.TaskGen,os,re | |
4 | import os,re | |
5 | from waflib import Task,TaskGen | |
6 | from waflib.Tools import ccroot | |
5 | 7 | def decide_ext(self,node): |
6 | 8 | if'cxx'in self.features: |
7 | 9 | return['.lex.cc'] |
14 | 16 | if isinstance(xx,str):return[xx] |
15 | 17 | return xx |
16 | 18 | tsk.last_cmd=lst=[] |
17 | lst.extend(to_list(env['FLEX'])) | |
18 | lst.extend(to_list(env['FLEXFLAGS'])) | |
19 | inputs=[a.path_from(bld.bldnode)for a in tsk.inputs] | |
19 | lst.extend(to_list(env.FLEX)) | |
20 | lst.extend(to_list(env.FLEXFLAGS)) | |
21 | inputs=[a.path_from(tsk.get_cwd())for a in tsk.inputs] | |
20 | 22 | if env.FLEX_MSYS: |
21 | 23 | inputs=[x.replace(os.sep,'/')for x in inputs] |
22 | 24 | lst.extend(inputs) |
23 | 25 | lst=[x for x in lst if x] |
24 | 26 | txt=bld.cmd_and_log(lst,cwd=wd,env=env.env or None,quiet=0) |
25 | 27 | tsk.outputs[0].write(txt.replace('\r\n','\n').replace('\r','\n')) |
26 | waflib.TaskGen.declare_chain(name='flex',rule=flexfun,ext_in='.l',decider=decide_ext,) | |
28 | TaskGen.declare_chain(name='flex',rule=flexfun,ext_in='.l',decider=decide_ext,) | |
29 | Task.classes['flex'].vars=['FLEXFLAGS','FLEX'] | |
30 | ccroot.USELIB_VARS['c'].add('FLEXFLAGS') | |
31 | ccroot.USELIB_VARS['cxx'].add('FLEXFLAGS') | |
27 | 32 | def configure(conf): |
28 | 33 | conf.find_program('flex',var='FLEX') |
29 | 34 | conf.env.FLEXFLAGS=['-t'] |
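The flex tool registers itself through `TaskGen.declare_chain`, which maps an input extension to a rule and an output-extension decider; the diff also pins `vars=['FLEXFLAGS','FLEX']` on the task class so flag changes trigger rebuilds. A toy version of the extension-chain dispatch (`declare_chain`/`output_name` here are simplified stand-ins for the waf machinery, not its API):

```python
CHAINS = {}

def declare_chain(name, rule, ext_in, decider):
    """Register a transformation for files ending in ext_in."""
    CHAINS[ext_in] = (name, rule, decider)

def output_name(filename, features=()):
    """Compute the generated file name for an input, via the decider."""
    for ext_in, (name, rule, decider) in CHAINS.items():
        if filename.endswith(ext_in):
            return filename[:-len(ext_in)] + decider(features)[0]
    raise ValueError('no chain matches %r' % filename)

def decide_ext(features):
    # mirrors the real decide_ext: C++ builds get .lex.cc, C gets .lex.c
    return ['.lex.cc'] if 'cxx' in features else ['.lex.c']

declare_chain('flex', rule=None, ext_in='.l', decider=decide_ext)
```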
13 | 13 | @conf |
14 | 14 | def g95_flags(conf): |
15 | 15 | v=conf.env |
16 | v['FCFLAGS_fcshlib']=['-fPIC'] | |
17 | v['FORTRANMODFLAG']=['-fmod=',''] | |
18 | v['FCFLAGS_DEBUG']=['-Werror'] | |
16 | v.FCFLAGS_fcshlib=['-fPIC'] | |
17 | v.FORTRANMODFLAG=['-fmod=',''] | |
18 | v.FCFLAGS_DEBUG=['-Werror'] | |
19 | 19 | @conf |
20 | 20 | def g95_modifier_win32(conf): |
21 | 21 | fc_config.fortran_modifier_win32(conf) |
27 | 27 | fc_config.fortran_modifier_darwin(conf) |
28 | 28 | @conf |
29 | 29 | def g95_modifier_platform(conf): |
30 | dest_os=conf.env['DEST_OS']or Utils.unversioned_sys_platform() | |
30 | dest_os=conf.env.DEST_OS or Utils.unversioned_sys_platform() | |
31 | 31 | g95_modifier_func=getattr(conf,'g95_modifier_'+dest_os,None) |
32 | 32 | if g95_modifier_func: |
33 | 33 | g95_modifier_func() |
43 | 43 | if not match: |
44 | 44 | conf.fatal('cannot determine g95 version') |
45 | 45 | k=match.groupdict() |
46 | conf.env['FC_VERSION']=(k['major'],k['minor']) | |
46 | conf.env.FC_VERSION=(k['major'],k['minor']) | |
47 | 47 | def configure(conf): |
48 | 48 | conf.find_g95() |
49 | 49 | conf.find_ar() |
11 | 11 | @conf |
12 | 12 | def gcc_common_flags(conf): |
13 | 13 | v=conf.env |
14 | v['CC_SRC_F']=[] | |
15 | v['CC_TGT_F']=['-c','-o'] | |
16 | if not v['LINK_CC']:v['LINK_CC']=v['CC'] | |
17 | v['CCLNK_SRC_F']=[] | |
18 | v['CCLNK_TGT_F']=['-o'] | |
19 | v['CPPPATH_ST']='-I%s' | |
20 | v['DEFINES_ST']='-D%s' | |
21 | v['LIB_ST']='-l%s' | |
22 | v['LIBPATH_ST']='-L%s' | |
23 | v['STLIB_ST']='-l%s' | |
24 | v['STLIBPATH_ST']='-L%s' | |
25 | v['RPATH_ST']='-Wl,-rpath,%s' | |
26 | v['SONAME_ST']='-Wl,-h,%s' | |
27 | v['SHLIB_MARKER']='-Wl,-Bdynamic' | |
28 | v['STLIB_MARKER']='-Wl,-Bstatic' | |
29 | v['cprogram_PATTERN']='%s' | |
30 | v['CFLAGS_cshlib']=['-fPIC'] | |
31 | v['LINKFLAGS_cshlib']=['-shared'] | |
32 | v['cshlib_PATTERN']='lib%s.so' | |
33 | v['LINKFLAGS_cstlib']=['-Wl,-Bstatic'] | |
34 | v['cstlib_PATTERN']='lib%s.a' | |
35 | v['LINKFLAGS_MACBUNDLE']=['-bundle','-undefined','dynamic_lookup'] | |
36 | v['CFLAGS_MACBUNDLE']=['-fPIC'] | |
37 | v['macbundle_PATTERN']='%s.bundle' | |
14 | v.CC_SRC_F=[] | |
15 | v.CC_TGT_F=['-c','-o'] | |
16 | if not v.LINK_CC: | |
17 | v.LINK_CC=v.CC | |
18 | v.CCLNK_SRC_F=[] | |
19 | v.CCLNK_TGT_F=['-o'] | |
20 | v.CPPPATH_ST='-I%s' | |
21 | v.DEFINES_ST='-D%s' | |
22 | v.LIB_ST='-l%s' | |
23 | v.LIBPATH_ST='-L%s' | |
24 | v.STLIB_ST='-l%s' | |
25 | v.STLIBPATH_ST='-L%s' | |
26 | v.RPATH_ST='-Wl,-rpath,%s' | |
27 | v.SONAME_ST='-Wl,-h,%s' | |
28 | v.SHLIB_MARKER='-Wl,-Bdynamic' | |
29 | v.STLIB_MARKER='-Wl,-Bstatic' | |
30 | v.cprogram_PATTERN='%s' | |
31 | v.CFLAGS_cshlib=['-fPIC'] | |
32 | v.LINKFLAGS_cshlib=['-shared'] | |
33 | v.cshlib_PATTERN='lib%s.so' | |
34 | v.LINKFLAGS_cstlib=['-Wl,-Bstatic'] | |
35 | v.cstlib_PATTERN='lib%s.a' | |
36 | v.LINKFLAGS_MACBUNDLE=['-bundle','-undefined','dynamic_lookup'] | |
37 | v.CFLAGS_MACBUNDLE=['-fPIC'] | |
38 | v.macbundle_PATTERN='%s.bundle' | |
38 | 39 | @conf |
39 | 40 | def gcc_modifier_win32(conf): |
40 | 41 | v=conf.env |
41 | v['cprogram_PATTERN']='%s.exe' | |
42 | v['cshlib_PATTERN']='%s.dll' | |
43 | v['implib_PATTERN']='lib%s.dll.a' | |
44 | v['IMPLIB_ST']='-Wl,--out-implib,%s' | |
45 | v['CFLAGS_cshlib']=[] | |
42 | v.cprogram_PATTERN='%s.exe' | |
43 | v.cshlib_PATTERN='%s.dll' | |
44 | v.implib_PATTERN='lib%s.dll.a' | |
45 | v.IMPLIB_ST='-Wl,--out-implib,%s' | |
46 | v.CFLAGS_cshlib=[] | |
46 | 47 | v.append_value('LINKFLAGS',['-Wl,--enable-auto-import']) |
47 | 48 | @conf |
48 | 49 | def gcc_modifier_cygwin(conf): |
49 | 50 | gcc_modifier_win32(conf) |
50 | 51 | v=conf.env |
51 | v['cshlib_PATTERN']='cyg%s.dll' | |
52 | v.cshlib_PATTERN='cyg%s.dll' | |
52 | 53 | v.append_value('LINKFLAGS_cshlib',['-Wl,--enable-auto-image-base']) |
53 | v['CFLAGS_cshlib']=[] | |
54 | v.CFLAGS_cshlib=[] | |
54 | 55 | @conf |
55 | 56 | def gcc_modifier_darwin(conf): |
56 | 57 | v=conf.env |
57 | v['CFLAGS_cshlib']=['-fPIC'] | |
58 | v['LINKFLAGS_cshlib']=['-dynamiclib'] | |
59 | v['cshlib_PATTERN']='lib%s.dylib' | |
60 | v['FRAMEWORKPATH_ST']='-F%s' | |
61 | v['FRAMEWORK_ST']=['-framework'] | |
62 | v['ARCH_ST']=['-arch'] | |
63 | v['LINKFLAGS_cstlib']=[] | |
64 | v['SHLIB_MARKER']=[] | |
65 | v['STLIB_MARKER']=[] | |
66 | v['SONAME_ST']=[] | |
58 | v.CFLAGS_cshlib=['-fPIC'] | |
59 | v.LINKFLAGS_cshlib=['-dynamiclib'] | |
60 | v.cshlib_PATTERN='lib%s.dylib' | |
61 | v.FRAMEWORKPATH_ST='-F%s' | |
62 | v.FRAMEWORK_ST=['-framework'] | |
63 | v.ARCH_ST=['-arch'] | |
64 | v.LINKFLAGS_cstlib=[] | |
65 | v.SHLIB_MARKER=[] | |
66 | v.STLIB_MARKER=[] | |
67 | v.SONAME_ST=[] | |
67 | 68 | @conf |
68 | 69 | def gcc_modifier_aix(conf): |
69 | 70 | v=conf.env |
70 | v['LINKFLAGS_cprogram']=['-Wl,-brtl'] | |
71 | v['LINKFLAGS_cshlib']=['-shared','-Wl,-brtl,-bexpfull'] | |
72 | v['SHLIB_MARKER']=[] | |
71 | v.LINKFLAGS_cprogram=['-Wl,-brtl'] | |
72 | v.LINKFLAGS_cshlib=['-shared','-Wl,-brtl,-bexpfull'] | |
73 | v.SHLIB_MARKER=[] | |
73 | 74 | @conf |
74 | 75 | def gcc_modifier_hpux(conf): |
75 | 76 | v=conf.env |
76 | v['SHLIB_MARKER']=[] | |
77 | v['STLIB_MARKER']=[] | |
78 | v['CFLAGS_cshlib']=['-fPIC','-DPIC'] | |
79 | v['cshlib_PATTERN']='lib%s.sl' | |
77 | v.SHLIB_MARKER=[] | |
78 | v.STLIB_MARKER=[] | |
79 | v.CFLAGS_cshlib=['-fPIC','-DPIC'] | |
80 | v.cshlib_PATTERN='lib%s.sl' | |
80 | 81 | @conf |
81 | 82 | def gcc_modifier_openbsd(conf): |
82 | 83 | conf.env.SONAME_ST=[] |
83 | 84 | @conf |
84 | 85 | def gcc_modifier_osf1V(conf): |
85 | 86 | v=conf.env |
86 | v['SHLIB_MARKER']=[] | |
87 | v['STLIB_MARKER']=[] | |
88 | v['SONAME_ST']=[] | |
87 | v.SHLIB_MARKER=[] | |
88 | v.STLIB_MARKER=[] | |
89 | v.SONAME_ST=[] | |
89 | 90 | @conf |
90 | 91 | def gcc_modifier_platform(conf): |
91 | 92 | gcc_modifier_func=getattr(conf,'gcc_modifier_'+conf.env.DEST_OS,None) |
12 | 12 | @conf |
13 | 13 | def common_flags_gdc(conf): |
14 | 14 | v=conf.env |
15 | v['DFLAGS']=[] | |
16 | v['D_SRC_F']=['-c'] | |
17 | v['D_TGT_F']='-o%s' | |
18 | v['D_LINKER']=v['D'] | |
19 | v['DLNK_SRC_F']='' | |
20 | v['DLNK_TGT_F']='-o%s' | |
21 | v['DINC_ST']='-I%s' | |
22 | v['DSHLIB_MARKER']=v['DSTLIB_MARKER']='' | |
23 | v['DSTLIB_ST']=v['DSHLIB_ST']='-l%s' | |
24 | v['DSTLIBPATH_ST']=v['DLIBPATH_ST']='-L%s' | |
25 | v['LINKFLAGS_dshlib']=['-shared'] | |
26 | v['DHEADER_ext']='.di' | |
15 | v.DFLAGS=[] | |
16 | v.D_SRC_F=['-c'] | |
17 | v.D_TGT_F='-o%s' | |
18 | v.D_LINKER=v.D | |
19 | v.DLNK_SRC_F='' | |
20 | v.DLNK_TGT_F='-o%s' | |
21 | v.DINC_ST='-I%s' | |
22 | v.DSHLIB_MARKER=v.DSTLIB_MARKER='' | |
23 | v.DSTLIB_ST=v.DSHLIB_ST='-l%s' | |
24 | v.DSTLIBPATH_ST=v.DLIBPATH_ST='-L%s' | |
25 | v.LINKFLAGS_dshlib=['-shared'] | |
26 | v.DHEADER_ext='.di' | |
27 | 27 | v.DFLAGS_d_with_header='-fintfc' |
28 | v['D_HDR_F']='-fintfc-file=%s' | |
28 | v.D_HDR_F='-fintfc-file=%s' | |
29 | 29 | def configure(conf): |
30 | 30 | conf.find_gdc() |
31 | 31 | conf.load('ar') |
13 | 13 | @conf |
14 | 14 | def gfortran_flags(conf): |
15 | 15 | v=conf.env |
16 | v['FCFLAGS_fcshlib']=['-fPIC'] | |
17 | v['FORTRANMODFLAG']=['-J',''] | |
18 | v['FCFLAGS_DEBUG']=['-Werror'] | |
16 | v.FCFLAGS_fcshlib=['-fPIC'] | |
17 | v.FORTRANMODFLAG=['-J',''] | |
18 | v.FCFLAGS_DEBUG=['-Werror'] | |
19 | 19 | @conf |
20 | 20 | def gfortran_modifier_win32(conf): |
21 | 21 | fc_config.fortran_modifier_win32(conf) |
27 | 27 | fc_config.fortran_modifier_darwin(conf) |
28 | 28 | @conf |
29 | 29 | def gfortran_modifier_platform(conf): |
30 | dest_os=conf.env['DEST_OS']or Utils.unversioned_sys_platform() | |
30 | dest_os=conf.env.DEST_OS or Utils.unversioned_sys_platform() | |
31 | 31 | gfortran_modifier_func=getattr(conf,'gfortran_modifier_'+dest_os,None) |
32 | 32 | if gfortran_modifier_func: |
33 | 33 | gfortran_modifier_func() |
57 | 57 | return var in k |
58 | 58 | def isT(var): |
59 | 59 | return var in k and k[var]!='0' |
60 | conf.env['FC_VERSION']=(k['__GNUC__'],k['__GNUC_MINOR__'],k['__GNUC_PATCHLEVEL__']) | |
60 | conf.env.FC_VERSION=(k['__GNUC__'],k['__GNUC_MINOR__'],k['__GNUC_PATCHLEVEL__']) | |
61 | 61 | def configure(conf): |
62 | 62 | conf.find_gfortran() |
63 | 63 | conf.find_ar() |
24 | 24 | self.source=self.to_nodes(getattr(self,'source',[])) |
25 | 25 | self.source.append(c_node) |
26 | 26 | class glib_genmarshal(Task.Task): |
27 | vars=['GLIB_GENMARSHAL_PREFIX','GLIB_GENMARSHAL'] | |
28 | color='BLUE' | |
29 | ext_out=['.h'] | |
27 | 30 | def run(self): |
28 | bld=self.inputs[0].__class__.ctx | |
31 | bld=self.generator.bld | |
29 | 32 | get=self.env.get_flat |
30 | 33 | cmd1="%s %s --prefix=%s --header > %s"%(get('GLIB_GENMARSHAL'),self.inputs[0].srcpath(),get('GLIB_GENMARSHAL_PREFIX'),self.outputs[0].abspath()) |
31 | 34 | ret=bld.exec_command(cmd1) |
34 | 37 | self.outputs[1].write(c) |
35 | 38 | cmd2="%s %s --prefix=%s --body >> %s"%(get('GLIB_GENMARSHAL'),self.inputs[0].srcpath(),get('GLIB_GENMARSHAL_PREFIX'),self.outputs[1].abspath()) |
36 | 39 | return bld.exec_command(cmd2) |
37 | vars=['GLIB_GENMARSHAL_PREFIX','GLIB_GENMARSHAL'] | |
38 | color='BLUE' | |
39 | ext_out=['.h'] | |
40 | 40 | @taskgen_method |
41 | 41 | def add_enums_from_template(self,source='',target='',template='',comments=''): |
42 | 42 | if not hasattr(self,'enums_list'): |
60 | 60 | raise Errors.WafError('missing source '+str(enum)) |
61 | 61 | source_list=[self.path.find_resource(k)for k in source_list] |
62 | 62 | inputs+=source_list |
63 | env['GLIB_MKENUMS_SOURCE']=[k.abspath()for k in source_list] | |
63 | env.GLIB_MKENUMS_SOURCE=[k.abspath()for k in source_list] | |
64 | 64 | if not enum['target']: |
65 | 65 | raise Errors.WafError('missing target '+str(enum)) |
66 | 66 | tgt_node=self.path.find_or_declare(enum['target']) |
67 | 67 | if tgt_node.name.endswith('.c'): |
68 | 68 | self.source.append(tgt_node) |
69 | env['GLIB_MKENUMS_TARGET']=tgt_node.abspath() | |
69 | env.GLIB_MKENUMS_TARGET=tgt_node.abspath() | |
70 | 70 | options=[] |
71 | 71 | if enum['template']: |
72 | 72 | template_node=self.path.find_resource(enum['template']) |
76 | 76 | for param,option in params.items(): |
77 | 77 | if enum[param]: |
78 | 78 | options.append('%s %r'%(option,enum[param])) |
79 | env['GLIB_MKENUMS_OPTIONS']=' '.join(options) | |
79 | env.GLIB_MKENUMS_OPTIONS=' '.join(options) | |
80 | 80 | task.set_inputs(inputs) |
81 | 81 | task.set_outputs(tgt_node) |
82 | 82 | class glib_mkenums(Task.Task): |
93 | 93 | @taskgen_method |
94 | 94 | def add_settings_enums(self,namespace,filename_list): |
95 | 95 | if hasattr(self,'settings_enum_namespace'): |
96 | raise Errors.WafError("Tried to add gsettings enums to '%s' more than once"%self.name) | |
96 | raise Errors.WafError("Tried to add gsettings enums to %r more than once"%self.name) | |
97 | 97 | self.settings_enum_namespace=namespace |
98 | 98 | if type(filename_list)!='list': |
99 | 99 | filename_list=[filename_list] |
103 | 103 | enums_tgt_node=[] |
104 | 104 | install_files=[] |
105 | 105 | settings_schema_files=getattr(self,'settings_schema_files',[]) |
106 | if settings_schema_files and not self.env['GLIB_COMPILE_SCHEMAS']: | |
106 | if settings_schema_files and not self.env.GLIB_COMPILE_SCHEMAS: | |
107 | 107 | raise Errors.WafError("Unable to process GSettings schemas - glib-compile-schemas was not found during configure") |
108 | 108 | if hasattr(self,'settings_enum_files'): |
109 | 109 | enums_task=self.create_task('glib_mkenums') |
110 | 110 | source_list=self.settings_enum_files |
111 | 111 | source_list=[self.path.find_resource(k)for k in source_list] |
112 | 112 | enums_task.set_inputs(source_list) |
113 | enums_task.env['GLIB_MKENUMS_SOURCE']=[k.abspath()for k in source_list] | |
113 | enums_task.env.GLIB_MKENUMS_SOURCE=[k.abspath()for k in source_list] | |
114 | 114 | target=self.settings_enum_namespace+'.enums.xml' |
115 | 115 | tgt_node=self.path.find_or_declare(target) |
116 | 116 | enums_task.set_outputs(tgt_node) |
117 | enums_task.env['GLIB_MKENUMS_TARGET']=tgt_node.abspath() | |
117 | enums_task.env.GLIB_MKENUMS_TARGET=tgt_node.abspath() | |
118 | 118 | enums_tgt_node=[tgt_node] |
119 | 119 | install_files.append(tgt_node) |
120 | 120 | options='--comments "<!-- @comment@ -->" --fhead "<schemalist>" --vhead " <@type@ id=\\"%s.@EnumName@\\">" --vprod " <value nick=\\"@valuenick@\\" value=\\"@valuenum@\\"/>" --vtail " </@type@>" --ftail "</schemalist>" '%(self.settings_enum_namespace) |
121 | enums_task.env['GLIB_MKENUMS_OPTIONS']=options | |
121 | enums_task.env.GLIB_MKENUMS_OPTIONS=options | |
122 | 122 | for schema in settings_schema_files: |
123 | 123 | schema_task=self.create_task('glib_validate_schema') |
124 | 124 | schema_node=self.path.find_resource(schema) |
125 | 125 | if not schema_node: |
126 | raise Errors.WafError("Cannot find the schema file '%s'"%schema) | |
126 | raise Errors.WafError("Cannot find the schema file %r"%schema) | |
127 | 127 | install_files.append(schema_node) |
128 | 128 | source_list=enums_tgt_node+[schema_node] |
129 | 129 | schema_task.set_inputs(source_list) |
130 | schema_task.env['GLIB_COMPILE_SCHEMAS_OPTIONS']=[("--schema-file="+k.abspath())for k in source_list] | |
130 | schema_task.env.GLIB_COMPILE_SCHEMAS_OPTIONS=[("--schema-file="+k.abspath())for k in source_list] | |
131 | 131 | target_node=schema_node.change_ext('.xml.valid') |
132 | 132 | schema_task.set_outputs(target_node) |
133 | schema_task.env['GLIB_VALIDATE_SCHEMA_OUTPUT']=target_node.abspath() | |
133 | schema_task.env.GLIB_VALIDATE_SCHEMA_OUTPUT=target_node.abspath() | |
134 | 134 | def compile_schemas_callback(bld): |
135 | 135 | if not bld.is_install:return |
136 | 136 | Logs.pprint('YELLOW','Updating GSettings schema cache') |
137 | 137 | command=Utils.subst_vars("${GLIB_COMPILE_SCHEMAS} ${GSETTINGSSCHEMADIR}",bld.env) |
138 | 138 | self.bld.exec_command(command) |
139 | 139 | if self.bld.is_install: |
140 | if not self.env['GSETTINGSSCHEMADIR']: | |
140 | if not self.env.GSETTINGSSCHEMADIR: | |
141 | 141 | raise Errors.WafError('GSETTINGSSCHEMADIR not defined (should have been set up automatically during configure)') |
142 | 142 | if install_files: |
143 | self.bld.install_files(self.env['GSETTINGSSCHEMADIR'],install_files) | |
143 | self.add_install_files(install_to=self.env.GSETTINGSSCHEMADIR,install_from=install_files) | |
144 | 144 | if not hasattr(self.bld,'_compile_schemas_registered'): |
145 | 145 | self.bld.add_post_fun(compile_schemas_callback) |
146 | 146 | self.bld._compile_schemas_registered=True |
149 | 149 | color='PINK' |
150 | 150 | @extension('.gresource.xml') |
151 | 151 | def process_gresource_source(self,node): |
152 | if not self.env['GLIB_COMPILE_RESOURCES']: | |
152 | if not self.env.GLIB_COMPILE_RESOURCES: | |
153 | 153 | raise Errors.WafError("Unable to process GResource file - glib-compile-resources was not found during configure") |
154 | 154 | if'gresource'in self.features: |
155 | 155 | return |
164 | 164 | task=self.create_task('glib_gresource_bundle',node,node.change_ext('')) |
165 | 165 | inst_to=getattr(self,'install_path',None) |
166 | 166 | if inst_to: |
167 | self.bld.install_files(inst_to,task.outputs) | |
167 | self.add_install_files(install_to=inst_to,install_from=task.outputs) | |
168 | 168 | class glib_gresource_base(Task.Task): |
169 | 169 | color='BLUE' |
170 | 170 | base_cmd='${GLIB_COMPILE_RESOURCES} --sourcedir=${SRC[0].parent.srcpath()} --sourcedir=${SRC[0].bld_dir()}' |
171 | 171 | def scan(self): |
172 | 172 | bld=self.generator.bld |
173 | 173 | kw={} |
174 | try: | |
175 | if not kw.get('cwd',None): | |
176 | kw['cwd']=bld.cwd | |
177 | except AttributeError: | |
178 | bld.cwd=kw['cwd']=bld.variant_dir | |
174 | kw['cwd']=self.get_cwd() | |
179 | 175 | kw['quiet']=Context.BOTH |
180 | 176 | cmd=Utils.subst_vars('${GLIB_COMPILE_RESOURCES} --sourcedir=%s --sourcedir=%s --generate-dependencies %s'%(self.inputs[0].parent.srcpath(),self.inputs[0].bld_dir(),self.inputs[0].bldpath()),self.env) |
181 | 177 | output=bld.cmd_and_log(cmd,**kw) |
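The hunk above collapses the per-call-site `try/except` around `bld.cwd` into a single `kw['cwd']=self.get_cwd()` call. A minimal sketch of that consolidation, with illustrative names (`FakeBld`, a free-standing `get_cwd`) rather than waf's actual classes:

```python
# Sketch of the cwd fallback that the diff moves into a shared helper.
# The first caller that finds no bld.cwd falls back to the variant dir
# and caches it, so later callers skip the except branch entirely.
class FakeBld:
    def __init__(self, variant_dir, cwd=None):
        self.variant_dir = variant_dir
        if cwd is not None:
            self.cwd = cwd

def get_cwd(bld):
    """Return bld.cwd when set, falling back to the variant dir once."""
    try:
        return bld.cwd
    except AttributeError:
        bld.cwd = bld.variant_dir  # cache for subsequent calls
        return bld.cwd

bld = FakeBld('/tmp/build')
assert get_cwd(bld) == '/tmp/build'
assert bld.cwd == '/tmp/build'  # fallback was cached
```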
216 | 212 | if not gsettingsschemadir: |
217 | 213 | datadir=getstr('DATADIR') |
218 | 214 | if not datadir: |
219 | prefix=conf.env['PREFIX'] | |
215 | prefix=conf.env.PREFIX | |
220 | 216 | datadir=os.path.join(prefix,'share') |
221 | 217 | gsettingsschemadir=os.path.join(datadir,'glib-2.0','schemas') |
222 | conf.env['GSETTINGSSCHEMADIR']=gsettingsschemadir | |
218 | conf.env.GSETTINGSSCHEMADIR=gsettingsschemadir | |
223 | 219 | @conf |
224 | 220 | def find_glib_compile_resources(conf): |
225 | 221 | conf.find_program('glib-compile-resources',var='GLIB_COMPILE_RESOURCES') |
11 | 11 | @conf |
12 | 12 | def gxx_common_flags(conf): |
13 | 13 | v=conf.env |
14 | v['CXX_SRC_F']=[] | |
15 | v['CXX_TGT_F']=['-c','-o'] | |
16 | if not v['LINK_CXX']:v['LINK_CXX']=v['CXX'] | |
17 | v['CXXLNK_SRC_F']=[] | |
18 | v['CXXLNK_TGT_F']=['-o'] | |
19 | v['CPPPATH_ST']='-I%s' | |
20 | v['DEFINES_ST']='-D%s' | |
21 | v['LIB_ST']='-l%s' | |
22 | v['LIBPATH_ST']='-L%s' | |
23 | v['STLIB_ST']='-l%s' | |
24 | v['STLIBPATH_ST']='-L%s' | |
25 | v['RPATH_ST']='-Wl,-rpath,%s' | |
26 | v['SONAME_ST']='-Wl,-h,%s' | |
27 | v['SHLIB_MARKER']='-Wl,-Bdynamic' | |
28 | v['STLIB_MARKER']='-Wl,-Bstatic' | |
29 | v['cxxprogram_PATTERN']='%s' | |
30 | v['CXXFLAGS_cxxshlib']=['-fPIC'] | |
31 | v['LINKFLAGS_cxxshlib']=['-shared'] | |
32 | v['cxxshlib_PATTERN']='lib%s.so' | |
33 | v['LINKFLAGS_cxxstlib']=['-Wl,-Bstatic'] | |
34 | v['cxxstlib_PATTERN']='lib%s.a' | |
35 | v['LINKFLAGS_MACBUNDLE']=['-bundle','-undefined','dynamic_lookup'] | |
36 | v['CXXFLAGS_MACBUNDLE']=['-fPIC'] | |
37 | v['macbundle_PATTERN']='%s.bundle' | |
14 | v.CXX_SRC_F=[] | |
15 | v.CXX_TGT_F=['-c','-o'] | |
16 | if not v.LINK_CXX: | |
17 | v.LINK_CXX=v.CXX | |
18 | v.CXXLNK_SRC_F=[] | |
19 | v.CXXLNK_TGT_F=['-o'] | |
20 | v.CPPPATH_ST='-I%s' | |
21 | v.DEFINES_ST='-D%s' | |
22 | v.LIB_ST='-l%s' | |
23 | v.LIBPATH_ST='-L%s' | |
24 | v.STLIB_ST='-l%s' | |
25 | v.STLIBPATH_ST='-L%s' | |
26 | v.RPATH_ST='-Wl,-rpath,%s' | |
27 | v.SONAME_ST='-Wl,-h,%s' | |
28 | v.SHLIB_MARKER='-Wl,-Bdynamic' | |
29 | v.STLIB_MARKER='-Wl,-Bstatic' | |
30 | v.cxxprogram_PATTERN='%s' | |
31 | v.CXXFLAGS_cxxshlib=['-fPIC'] | |
32 | v.LINKFLAGS_cxxshlib=['-shared'] | |
33 | v.cxxshlib_PATTERN='lib%s.so' | |
34 | v.LINKFLAGS_cxxstlib=['-Wl,-Bstatic'] | |
35 | v.cxxstlib_PATTERN='lib%s.a' | |
36 | v.LINKFLAGS_MACBUNDLE=['-bundle','-undefined','dynamic_lookup'] | |
37 | v.CXXFLAGS_MACBUNDLE=['-fPIC'] | |
38 | v.macbundle_PATTERN='%s.bundle' | |
38 | 39 | @conf |
39 | 40 | def gxx_modifier_win32(conf): |
40 | 41 | v=conf.env |
41 | v['cxxprogram_PATTERN']='%s.exe' | |
42 | v['cxxshlib_PATTERN']='%s.dll' | |
43 | v['implib_PATTERN']='lib%s.dll.a' | |
44 | v['IMPLIB_ST']='-Wl,--out-implib,%s' | |
45 | v['CXXFLAGS_cxxshlib']=[] | |
42 | v.cxxprogram_PATTERN='%s.exe' | |
43 | v.cxxshlib_PATTERN='%s.dll' | |
44 | v.implib_PATTERN='lib%s.dll.a' | |
45 | v.IMPLIB_ST='-Wl,--out-implib,%s' | |
46 | v.CXXFLAGS_cxxshlib=[] | |
46 | 47 | v.append_value('LINKFLAGS',['-Wl,--enable-auto-import']) |
47 | 48 | @conf |
48 | 49 | def gxx_modifier_cygwin(conf): |
49 | 50 | gxx_modifier_win32(conf) |
50 | 51 | v=conf.env |
51 | v['cxxshlib_PATTERN']='cyg%s.dll' | |
52 | v.cxxshlib_PATTERN='cyg%s.dll' | |
52 | 53 | v.append_value('LINKFLAGS_cxxshlib',['-Wl,--enable-auto-image-base']) |
53 | v['CXXFLAGS_cxxshlib']=[] | |
54 | v.CXXFLAGS_cxxshlib=[] | |
54 | 55 | @conf |
55 | 56 | def gxx_modifier_darwin(conf): |
56 | 57 | v=conf.env |
57 | v['CXXFLAGS_cxxshlib']=['-fPIC'] | |
58 | v['LINKFLAGS_cxxshlib']=['-dynamiclib'] | |
59 | v['cxxshlib_PATTERN']='lib%s.dylib' | |
60 | v['FRAMEWORKPATH_ST']='-F%s' | |
61 | v['FRAMEWORK_ST']=['-framework'] | |
62 | v['ARCH_ST']=['-arch'] | |
63 | v['LINKFLAGS_cxxstlib']=[] | |
64 | v['SHLIB_MARKER']=[] | |
65 | v['STLIB_MARKER']=[] | |
66 | v['SONAME_ST']=[] | |
58 | v.CXXFLAGS_cxxshlib=['-fPIC'] | |
59 | v.LINKFLAGS_cxxshlib=['-dynamiclib'] | |
60 | v.cxxshlib_PATTERN='lib%s.dylib' | |
61 | v.FRAMEWORKPATH_ST='-F%s' | |
62 | v.FRAMEWORK_ST=['-framework'] | |
63 | v.ARCH_ST=['-arch'] | |
64 | v.LINKFLAGS_cxxstlib=[] | |
65 | v.SHLIB_MARKER=[] | |
66 | v.STLIB_MARKER=[] | |
67 | v.SONAME_ST=[] | |
67 | 68 | @conf |
68 | 69 | def gxx_modifier_aix(conf): |
69 | 70 | v=conf.env |
70 | v['LINKFLAGS_cxxprogram']=['-Wl,-brtl'] | |
71 | v['LINKFLAGS_cxxshlib']=['-shared','-Wl,-brtl,-bexpfull'] | |
72 | v['SHLIB_MARKER']=[] | |
71 | v.LINKFLAGS_cxxprogram=['-Wl,-brtl'] | |
72 | v.LINKFLAGS_cxxshlib=['-shared','-Wl,-brtl,-bexpfull'] | |
73 | v.SHLIB_MARKER=[] | |
73 | 74 | @conf |
74 | 75 | def gxx_modifier_hpux(conf): |
75 | 76 | v=conf.env |
76 | v['SHLIB_MARKER']=[] | |
77 | v['STLIB_MARKER']=[] | |
78 | v['CFLAGS_cxxshlib']=['-fPIC','-DPIC'] | |
79 | v['cxxshlib_PATTERN']='lib%s.sl' | |
77 | v.SHLIB_MARKER=[] | |
78 | v.STLIB_MARKER=[] | |
79 | v.CFLAGS_cxxshlib=['-fPIC','-DPIC'] | |
80 | v.cxxshlib_PATTERN='lib%s.sl' | |
80 | 81 | @conf |
81 | 82 | def gxx_modifier_openbsd(conf): |
82 | 83 | conf.env.SONAME_ST=[] |
83 | 84 | @conf |
84 | 85 | def gcc_modifier_osf1V(conf): |
85 | 86 | v=conf.env |
86 | v['SHLIB_MARKER']=[] | |
87 | v['STLIB_MARKER']=[] | |
88 | v['SONAME_ST']=[] | |
87 | v.SHLIB_MARKER=[] | |
88 | v.STLIB_MARKER=[] | |
89 | v.SONAME_ST=[] | |
89 | 90 | @conf |
90 | 91 | def gxx_modifier_platform(conf): |
91 | 92 | gxx_modifier_func=getattr(conf,'gxx_modifier_'+conf.env.DEST_OS,None) |
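Every hunk in this file swaps dict-style `v['KEY']` access for attribute-style `v.KEY`. Both spellings work on waf's `ConfigSet` because it routes attribute access into the same underlying table, with unset keys reading as an empty (falsy) value. A toy version showing just that behavior (real `ConfigSet` also supports parent environments and copy-on-write, which this sketch omits):

```python
class MiniConfigSet(object):
    """Toy ConfigSet: v.KEY and v['KEY'] hit the same table, and
    missing keys read as an empty list instead of raising."""
    def __init__(self):
        object.__setattr__(self, 'table', {})
    def __setattr__(self, name, value):
        self.table[name] = value
    def __getattr__(self, name):
        # only called when normal attribute lookup fails
        return self.table.get(name, [])
    def __setitem__(self, name, value):
        self.table[name] = value
    def __getitem__(self, name):
        return self.table.get(name, [])

v = MiniConfigSet()
v['CXX_TGT_F'] = ['-c', '-o']        # old dict style
assert v.CXX_TGT_F == ['-c', '-o']   # new attribute style, same data
v.LIB_ST = '-l%s'
assert v['LIB_ST'] == '-l%s'
if not v.LINK_CXX:                   # unset keys are falsy, as in the hunk
    v.LINK_CXX = v.CXX
```

This is why the diff is purely cosmetic for behavior: no values change, only the access syntax.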
6 | 6 | from waflib.Configure import conf |
7 | 7 | @conf |
8 | 8 | def find_icc(conf): |
9 | if sys.platform=='cygwin': | |
10 | conf.fatal('The Intel compiler does not work on Cygwin') | |
11 | 9 | cc=conf.find_program(['icc','ICL'],var='CC') |
12 | 10 | conf.get_cc_version(cc,icc=True) |
13 | 11 | conf.env.CC_NAME='icc' |
6 | 6 | from waflib.Configure import conf |
7 | 7 | @conf |
8 | 8 | def find_icpc(conf): |
9 | if sys.platform=='cygwin': | |
10 | conf.fatal('The Intel compiler does not work on Cygwin') | |
11 | 9 | cxx=conf.find_program('icpc',var='CXX') |
12 | 10 | conf.get_cc_version(cxx,icc=True) |
13 | 11 | conf.env.CXX_NAME='icc' |
1 | 1 | # encoding: utf-8 |
2 | 2 | # WARNING! Do not edit! https://waf.io/book/index.html#_obtaining_the_waf_file |
3 | 3 | |
4 | import re | |
5 | from waflib import Utils | |
6 | from waflib.Tools import fc,fc_config,fc_scan,ar | |
4 | import os,re | |
5 | from waflib import Utils,Logs,Errors | |
6 | from waflib.Tools import fc,fc_config,fc_scan,ar,ccroot | |
7 | 7 | from waflib.Configure import conf |
8 | from waflib.TaskGen import after_method,feature | |
8 | 9 | @conf |
9 | 10 | def find_ifort(conf): |
10 | 11 | fc=conf.find_program('ifort',var='FC') |
53 | 54 | if not match: |
54 | 55 | conf.fatal('cannot determine ifort version.') |
55 | 56 | k=match.groupdict() |
56 | conf.env['FC_VERSION']=(k['major'],k['minor']) | |
57 | conf.env.FC_VERSION=(k['major'],k['minor']) | |
57 | 58 | def configure(conf): |
58 | 59 | if Utils.is_win32: |
59 | 60 | compiler,version,path,includes,libdirs,arch=conf.detect_ifort(True) |
77 | 78 | conf.fc_flags() |
78 | 79 | conf.fc_add_flags() |
79 | 80 | conf.ifort_modifier_platform() |
80 | import os,sys,re,tempfile | |
81 | from waflib import Task,Logs,Options,Errors | |
82 | from waflib.Logs import debug,warn | |
83 | from waflib.TaskGen import after_method,feature | |
84 | from waflib.Configure import conf | |
85 | from waflib.Tools import ccroot,ar,winres | |
86 | 81 | all_ifort_platforms=[('intel64','amd64'),('em64t','amd64'),('ia32','x86'),('Itanium','ia64')] |
87 | 82 | @conf |
88 | 83 | def gather_ifort_versions(conf,versions): |
100 | 95 | version=Utils.winreg.EnumKey(all_versions,index) |
101 | 96 | except WindowsError: |
102 | 97 | break |
103 | index=index+1 | |
98 | index+=1 | |
104 | 99 | if not version_pattern.match(version): |
105 | 100 | continue |
106 | targets=[] | |
101 | targets={} | |
107 | 102 | for target,arch in all_ifort_platforms: |
103 | if target=='intel64':targetDir='EM64T_NATIVE' | |
104 | else:targetDir=target | |
108 | 105 | try: |
109 | if target=='intel64':targetDir='EM64T_NATIVE' | |
110 | else:targetDir=target | |
111 | 106 | Utils.winreg.OpenKey(all_versions,version+'\\'+targetDir) |
112 | 107 | icl_version=Utils.winreg.OpenKey(all_versions,version) |
113 | 108 | path,type=Utils.winreg.QueryValueEx(icl_version,'ProductDir') |
109 | except WindowsError: | |
110 | pass | |
111 | else: | |
114 | 112 | batch_file=os.path.join(path,'bin','iclvars.bat') |
115 | 113 | if os.path.isfile(batch_file): |
116 | try: | |
117 | targets.append((target,(arch,get_compiler_env(conf,'intel',version,target,batch_file)))) | |
118 | except conf.errors.ConfigurationError: | |
119 | pass | |
120 | except WindowsError: | |
121 | pass | |
114 | targets[target]=target_compiler(conf,'intel',arch,version,target,batch_file) | |
122 | 115 | for target,arch in all_ifort_platforms: |
123 | 116 | try: |
124 | 117 | icl_version=Utils.winreg.OpenKey(all_versions,version+'\\'+target) |
125 | 118 | path,type=Utils.winreg.QueryValueEx(icl_version,'ProductDir') |
119 | except WindowsError: | |
120 | continue | |
121 | else: | |
126 | 122 | batch_file=os.path.join(path,'bin','iclvars.bat') |
127 | 123 | if os.path.isfile(batch_file): |
128 | try: | |
129 | targets.append((target,(arch,get_compiler_env(conf,'intel',version,target,batch_file)))) | |
130 | except conf.errors.ConfigurationError: | |
131 | pass | |
132 | except WindowsError: | |
133 | continue | |
124 | targets[target]=target_compiler(conf,'intel',arch,version,target,batch_file) | |
134 | 125 | major=version[0:2] |
135 | versions.append(('intel '+major,targets)) | |
136 | def setup_ifort(conf,versions,arch=False): | |
137 | platforms=Utils.to_list(conf.env['MSVC_TARGETS'])or[i for i,j in all_ifort_platforms] | |
138 | desired_versions=conf.env['MSVC_VERSIONS']or[v for v,_ in versions][::-1] | |
139 | versiondict=dict(versions) | |
126 | versions['intel '+major]=targets | |
127 | @conf | |
128 | def setup_ifort(conf,versiondict): | |
129 | platforms=Utils.to_list(conf.env.MSVC_TARGETS)or[i for i,j in all_ifort_platforms] | |
130 | desired_versions=conf.env.MSVC_VERSIONS or list(reversed(list(versiondict.keys()))) | |
140 | 131 | for version in desired_versions: |
141 | 132 | try: |
142 | targets=dict(versiondict[version]) | |
143 | for target in platforms: | |
144 | try: | |
145 | try: | |
146 | realtarget,(p1,p2,p3)=targets[target] | |
147 | except conf.errors.ConfigurationError: | |
148 | del(targets[target]) | |
149 | else: | |
150 | compiler,revision=version.rsplit(' ',1) | |
151 | if arch: | |
152 | return compiler,revision,p1,p2,p3,realtarget | |
153 | else: | |
154 | return compiler,revision,p1,p2,p3 | |
155 | except KeyError: | |
156 | continue | |
133 | targets=versiondict[version] | |
157 | 134 | except KeyError: |
158 | 135 | continue |
159 | conf.fatal('msvc: Impossible to find a valid architecture for building (in setup_ifort)') | |
136 | for arch in platforms: | |
137 | try: | |
138 | cfg=targets[arch] | |
139 | except KeyError: | |
140 | continue | |
141 | cfg.evaluate() | |
142 | if cfg.is_valid: | |
143 | compiler,revision=version.rsplit(' ',1) | |
144 | return compiler,revision,cfg.bindirs,cfg.incdirs,cfg.libdirs,cfg.cpu | |
145 | conf.fatal('ifort: Impossible to find a valid architecture for building %r - %r'%(desired_versions,list(versiondict.keys()))) | |
160 | 146 | @conf |
161 | 147 | def get_ifort_version_win32(conf,compiler,version,target,vcvars): |
162 | 148 | try: |
187 | 173 | elif line.startswith('LIB='): |
188 | 174 | MSVC_LIBDIR=[i for i in line[4:].split(';')if i] |
189 | 175 | if None in(MSVC_PATH,MSVC_INCDIR,MSVC_LIBDIR): |
190 | conf.fatal('msvc: Could not find a valid architecture for building (get_ifort_version_win32)') | |
176 | conf.fatal('ifort: Could not find a valid architecture for building (get_ifort_version_win32)') | |
191 | 177 | env=dict(os.environ) |
192 | 178 | env.update(PATH=path) |
193 | 179 | compiler_name,linker_name,lib_name=_get_prog_names(conf,compiler) |
195 | 181 | if'CL'in env: |
196 | 182 | del(env['CL']) |
197 | 183 | try: |
198 | try: | |
199 | conf.cmd_and_log(fc+['/help'],env=env) | |
200 | except UnicodeError: | |
201 | st=Utils.ex_stack() | |
202 | if conf.logger: | |
203 | conf.logger.error(st) | |
204 | conf.fatal('msvc: Unicode error - check the code page?') | |
205 | except Exception as e: | |
206 | debug('msvc: get_ifort_version: %r %r %r -> failure %s'%(compiler,version,target,str(e))) | |
207 | conf.fatal('msvc: cannot run the compiler in get_ifort_version (run with -v to display errors)') | |
208 | else: | |
209 | debug('msvc: get_ifort_version: %r %r %r -> OK',compiler,version,target) | |
184 | conf.cmd_and_log(fc+['/help'],env=env) | |
185 | except UnicodeError: | |
186 | st=Utils.ex_stack() | |
187 | if conf.logger: | |
188 | conf.logger.error(st) | |
189 | conf.fatal('ifort: Unicode error - check the code page?') | |
190 | except Exception as e: | |
191 | Logs.debug('ifort: get_ifort_version: %r %r %r -> failure %s',compiler,version,target,str(e)) | |
192 | conf.fatal('ifort: cannot run the compiler in get_ifort_version (run with -v to display errors)') | |
193 | else: | |
194 | Logs.debug('ifort: get_ifort_version: %r %r %r -> OK',compiler,version,target) | |
210 | 195 | finally: |
211 | 196 | conf.env[compiler_name]='' |
212 | 197 | return(MSVC_PATH,MSVC_INCDIR,MSVC_LIBDIR) |
213 | def get_compiler_env(conf,compiler,version,bat_target,bat,select=None): | |
214 | lazy=getattr(Options.options,'msvc_lazy',True) | |
215 | if conf.env.MSVC_LAZY_AUTODETECT is False: | |
216 | lazy=False | |
217 | def msvc_thunk(): | |
218 | vs=conf.get_ifort_version_win32(compiler,version,bat_target,bat) | |
219 | if select: | |
220 | return select(vs) | |
221 | else: | |
222 | return vs | |
223 | return lazytup(msvc_thunk,lazy,([],[],[])) | |
224 | class lazytup(object): | |
225 | def __init__(self,fn,lazy=True,default=None): | |
226 | self.fn=fn | |
227 | self.default=default | |
228 | if not lazy: | |
229 | self.evaluate() | |
230 | def __len__(self): | |
231 | self.evaluate() | |
232 | return len(self.value) | |
233 | def __iter__(self): | |
234 | self.evaluate() | |
235 | for i,v in enumerate(self.value): | |
236 | yield v | |
237 | def __getitem__(self,i): | |
238 | self.evaluate() | |
239 | return self.value[i] | |
198 | class target_compiler(object): | |
199 | def __init__(self,ctx,compiler,cpu,version,bat_target,bat,callback=None): | |
200 | self.conf=ctx | |
201 | self.name=None | |
202 | self.is_valid=False | |
203 | self.is_done=False | |
204 | self.compiler=compiler | |
205 | self.cpu=cpu | |
206 | self.version=version | |
207 | self.bat_target=bat_target | |
208 | self.bat=bat | |
209 | self.callback=callback | |
210 | def evaluate(self): | |
211 | if self.is_done: | |
212 | return | |
213 | self.is_done=True | |
214 | try: | |
215 | vs=self.conf.get_msvc_version(self.compiler,self.version,self.bat_target,self.bat) | |
216 | except Errors.ConfigurationError: | |
217 | self.is_valid=False | |
218 | return | |
219 | if self.callback: | |
220 | vs=self.callback(self,vs) | |
221 | self.is_valid=True | |
222 | (self.bindirs,self.incdirs,self.libdirs)=vs | |
223 | def __str__(self): | |
224 | return str((self.bindirs,self.incdirs,self.libdirs)) | |
240 | 225 | def __repr__(self): |
241 | if hasattr(self,'value'): | |
242 | return repr(self.value) | |
243 | elif self.default: | |
244 | return repr(self.default) | |
245 | else: | |
246 | self.evaluate() | |
247 | return repr(self.value) | |
248 | def evaluate(self): | |
249 | if hasattr(self,'value'): | |
250 | return | |
251 | self.value=self.fn() | |
252 | @conf | |
253 | def get_ifort_versions(conf,eval_and_save=True): | |
254 | if conf.env['IFORT_INSTALLED_VERSIONS']: | |
255 | return conf.env['IFORT_INSTALLED_VERSIONS'] | |
256 | lst=[] | |
257 | conf.gather_ifort_versions(lst) | |
258 | if eval_and_save: | |
259 | def checked_target(t): | |
260 | target,(arch,paths)=t | |
261 | try: | |
262 | paths.evaluate() | |
263 | except conf.errors.ConfigurationError: | |
264 | return None | |
265 | else: | |
266 | return t | |
267 | lst=[(version,list(filter(checked_target,targets)))for version,targets in lst] | |
268 | conf.env['IFORT_INSTALLED_VERSIONS']=lst | |
269 | return lst | |
270 | @conf | |
271 | def detect_ifort(conf,arch=False): | |
272 | versions=get_ifort_versions(conf,False) | |
273 | return setup_ifort(conf,versions,arch) | |
274 | def _get_prog_names(conf,compiler): | |
226 | return repr((self.bindirs,self.incdirs,self.libdirs)) | |
227 | @conf | |
228 | def detect_ifort(self): | |
229 | return self.setup_ifort(self.get_ifort_versions(False)) | |
230 | @conf | |
231 | def get_ifort_versions(self,eval_and_save=True): | |
232 | dct={} | |
233 | self.gather_ifort_versions(dct) | |
234 | return dct | |
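The hunks above replace the `lazytup` wrapper with an explicit `target_compiler` object: the expensive toolchain probe is deferred until `evaluate()`, runs at most once, and records an `is_valid` flag instead of letting a configuration error escape. A sketch of that memoized-probe pattern, with hypothetical names (`LazyProbe`, `probe` standing in for `get_msvc_version`):

```python
class LazyProbe(object):
    """Sketch of the target_compiler pattern: defer an expensive probe
    until evaluate(), run it at most once, and record validity rather
    than raising. RuntimeError stands in for Errors.ConfigurationError."""
    def __init__(self, probe):
        self.probe = probe
        self.is_done = False
        self.is_valid = False
        self.result = None
    def evaluate(self):
        if self.is_done:
            return
        self.is_done = True
        try:
            self.result = self.probe()
        except RuntimeError:
            return  # is_valid stays False; callers just skip this target
        self.is_valid = True

calls = []
def probe():
    calls.append(1)
    return ('bin', 'inc', 'lib')

p = LazyProbe(probe)
p.evaluate()
p.evaluate()  # second call is a no-op
assert p.is_valid and p.result == ('bin', 'inc', 'lib') and len(calls) == 1
```

This matches the new `setup_ifort` loop, which calls `cfg.evaluate()` and then checks `cfg.is_valid` before unpacking `bindirs`/`incdirs`/`libdirs`.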
235 | def _get_prog_names(self,compiler): | |
275 | 236 | if compiler=='intel': |
276 | 237 | compiler_name='ifort' |
277 | 238 | linker_name='XILINK' |
284 | 245 | @conf |
285 | 246 | def find_ifort_win32(conf): |
286 | 247 | v=conf.env |
287 | path=v['PATH'] | |
288 | compiler=v['MSVC_COMPILER'] | |
289 | version=v['MSVC_VERSION'] | |
248 | path=v.PATH | |
249 | compiler=v.MSVC_COMPILER | |
250 | version=v.MSVC_VERSION | |
290 | 251 | compiler_name,linker_name,lib_name=_get_prog_names(conf,compiler) |
291 | 252 | v.IFORT_MANIFEST=(compiler=='intel'and version>=11) |
292 | 253 | fc=conf.find_program(compiler_name,var='FC',path_list=path) |
294 | 255 | if path:env.update(PATH=';'.join(path)) |
295 | 256 | if not conf.cmd_and_log(fc+['/nologo','/help'],env=env): |
296 | 257 | conf.fatal('not intel fortran compiler could not be identified') |
297 | v['FC_NAME']='IFORT' | |
298 | if not v['LINK_FC']: | |
258 | v.FC_NAME='IFORT' | |
259 | if not v.LINK_FC: | |
299 | 260 | conf.find_program(linker_name,var='LINK_FC',path_list=path,mandatory=True) |
300 | if not v['AR']: | |
261 | if not v.AR: | |
301 | 262 | conf.find_program(lib_name,path_list=path,var='AR',mandatory=True) |
302 | v['ARFLAGS']=['/NOLOGO'] | |
263 | v.ARFLAGS=['/nologo'] | |
303 | 264 | if v.IFORT_MANIFEST: |
304 | 265 | conf.find_program('MT',path_list=path,var='MT') |
305 | v['MTFLAGS']=['/NOLOGO'] | |
266 | v.MTFLAGS=['/nologo'] | |
306 | 267 | try: |
307 | 268 | conf.load('winres') |
308 | 269 | except Errors.WafError: |
309 | warn('Resource compiler not found. Compiling resource file is disabled') | |
270 | Logs.warn('Resource compiler not found. Compiling resource file is disabled') | |
310 | 271 | @after_method('apply_link') |
311 | 272 | @feature('fc') |
312 | 273 | def apply_flags_ifort(self): |
325 | 286 | pdbnode=self.link_task.outputs[0].change_ext('.pdb') |
326 | 287 | self.link_task.outputs.append(pdbnode) |
327 | 288 | if getattr(self,'install_task',None): |
328 | self.pdb_install_task=self.bld.install_files(self.install_task.dest,pdbnode,env=self.env) | |
289 | self.pdb_install_task=self.add_install_files(install_to=self.install_task.install_to,install_from=pdbnode) | |
329 | 290 | break |
330 | 291 | @feature('fcprogram','fcshlib','fcprogram_test') |
331 | 292 | @after_method('apply_link') |
336 | 297 | out_node=self.link_task.outputs[0] |
337 | 298 | man_node=out_node.parent.find_or_declare(out_node.name+'.manifest') |
338 | 299 | self.link_task.outputs.append(man_node) |
339 | self.link_task.do_manifest=True | |
340 | def exec_mf(self): | |
341 | env=self.env | |
342 | mtool=env['MT'] | |
343 | if not mtool: | |
344 | return 0 | |
345 | self.do_manifest=False | |
346 | outfile=self.outputs[0].abspath() | |
347 | manifest=None | |
348 | for out_node in self.outputs: | |
349 | if out_node.name.endswith('.manifest'): | |
350 | manifest=out_node.abspath() | |
351 | break | |
352 | if manifest is None: | |
353 | return 0 | |
354 | mode='' | |
355 | if'fcprogram'in self.generator.features or'fcprogram_test'in self.generator.features: | |
356 | mode='1' | |
357 | elif'fcshlib'in self.generator.features: | |
358 | mode='2' | |
359 | debug('msvc: embedding manifest in mode %r'%mode) | |
360 | lst=[]+mtool | |
361 | lst.extend(Utils.to_list(env['MTFLAGS'])) | |
362 | lst.extend(['-manifest',manifest]) | |
363 | lst.append('-outputresource:%s;%s'%(outfile,mode)) | |
364 | return self.exec_command(lst) | |
365 | def quote_response_command(self,flag): | |
366 | if flag.find(' ')>-1: | |
367 | for x in('/LIBPATH:','/IMPLIB:','/OUT:','/I'): | |
368 | if flag.startswith(x): | |
369 | flag='%s"%s"'%(x,flag[len(x):]) | |
370 | break | |
371 | else: | |
372 | flag='"%s"'%flag | |
373 | return flag | |
374 | def exec_response_command(self,cmd,**kw): | |
375 | try: | |
376 | tmp=None | |
377 | if sys.platform.startswith('win')and isinstance(cmd,list)and len(' '.join(cmd))>=8192: | |
378 | program=cmd[0] | |
379 | cmd=[self.quote_response_command(x)for x in cmd] | |
380 | (fd,tmp)=tempfile.mkstemp() | |
381 | os.write(fd,'\r\n'.join(i.replace('\\','\\\\')for i in cmd[1:]).encode()) | |
382 | os.close(fd) | |
383 | cmd=[program,'@'+tmp] | |
384 | ret=super(self.__class__,self).exec_command(cmd,**kw) | |
385 | finally: | |
386 | if tmp: | |
387 | try: | |
388 | os.remove(tmp) | |
389 | except OSError: | |
390 | pass | |
391 | return ret | |
392 | def exec_command_ifort(self,*k,**kw): | |
393 | if isinstance(k[0],list): | |
394 | lst=[] | |
395 | carry='' | |
396 | for a in k[0]: | |
397 | if a=='/Fo'or a=='/doc'or a[-1]==':': | |
398 | carry=a | |
399 | else: | |
400 | lst.append(carry+a) | |
401 | carry='' | |
402 | k=[lst] | |
403 | if self.env['PATH']: | |
404 | env=dict(self.env.env or os.environ) | |
405 | env.update(PATH=';'.join(self.env['PATH'])) | |
406 | kw['env']=env | |
407 | if not'cwd'in kw: | |
408 | kw['cwd']=self.generator.bld.variant_dir | |
409 | ret=self.exec_response_command(k[0],**kw) | |
410 | if not ret and getattr(self,'do_manifest',None): | |
411 | ret=self.exec_mf() | |
412 | return ret | |
413 | def wrap_class(class_name): | |
414 | cls=Task.classes.get(class_name,None) | |
415 | if not cls: | |
416 | return None | |
417 | derived_class=type(class_name,(cls,),{}) | |
418 | def exec_command(self,*k,**kw): | |
419 | if self.env.IFORT_WIN32: | |
420 | return self.exec_command_ifort(*k,**kw) | |
421 | else: | |
422 | return super(derived_class,self).exec_command(*k,**kw) | |
423 | derived_class.exec_command=exec_command | |
424 | derived_class.exec_response_command=exec_response_command | |
425 | derived_class.quote_response_command=quote_response_command | |
426 | derived_class.exec_command_ifort=exec_command_ifort | |
427 | derived_class.exec_mf=exec_mf | |
428 | if hasattr(cls,'hcode'): | |
429 | derived_class.hcode=cls.hcode | |
430 | return derived_class | |
431 | for k in'fc fcprogram fcprogram_test fcshlib fcstlib'.split(): | |
432 | wrap_class(k) | |
300 | self.env.DO_MANIFEST=True |
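The removed `wrap_class` helper used dynamic subclassing: it derived a new class from each registered Task class at runtime and overrode `exec_command` so behavior could be switched on an env flag (`IFORT_WIN32`). A simplified sketch of that technique, with made-up names (`Base`, `classes`) in place of waf's task registry:

```python
# Sketch of the dynamic-subclassing trick behind the removed wrap_class:
# derive a class with type(), override one method, and fall back to the
# original implementation via super() when the env flag is off.
classes = {}

class Base(object):
    def exec_command(self, cmd):
        return 'generic:' + cmd

def wrap_class(name, use_special):
    cls = classes.get(name, Base)
    derived = type(name, (cls,), {})
    def exec_command(self, cmd):
        if use_special(self):
            return 'ifort:' + cmd
        return super(derived, self).exec_command(cmd)
    derived.exec_command = exec_command
    classes[name] = derived
    return derived

fc = wrap_class('fc', lambda self: getattr(self, 'win32', False))
t = fc()
assert t.exec_command('link') == 'generic:link'
t.win32 = True
assert t.exec_command('link') == 'ifort:link'
```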
44 | 44 | task=self.create_task('intltool',node,node.change_ext('')) |
45 | 45 | inst=getattr(self,'install_path',None) |
46 | 46 | if inst: |
47 | self.bld.install_files(inst,task.outputs) | |
47 | self.add_install_files(install_to=inst,install_from=task.outputs) | |
48 | 48 | @feature('intltool_po') |
49 | 49 | def apply_intltool_po(self): |
50 | 50 | try:self.meths.remove('process_source') |
70 | 70 | filename=task.outputs[0].name |
71 | 71 | (langname,ext)=os.path.splitext(filename) |
72 | 72 | inst_file=inst+os.sep+langname+os.sep+'LC_MESSAGES'+os.sep+appname+'.mo' |
73 | self.bld.install_as(inst_file,task.outputs[0],chmod=getattr(self,'chmod',Utils.O644),env=task.env) | |
73 | self.add_install_as(install_to=inst_file,install_from=task.outputs[0],chmod=getattr(self,'chmod',Utils.O644)) | |
74 | 74 | else: |
75 | 75 | Logs.pprint('RED',"Error no LINGUAS file found in po directory") |
76 | 76 | class po(Task.Task): |
7 | 7 | def find_irixcc(conf): |
8 | 8 | v=conf.env |
9 | 9 | cc=None |
10 | if v['CC']:cc=v['CC'] | |
11 | elif'CC'in conf.environ:cc=conf.environ['CC'] | |
12 | if not cc:cc=conf.find_program('cc',var='CC') | |
13 | if not cc:conf.fatal('irixcc was not found') | |
10 | if v.CC: | |
11 | cc=v.CC | |
12 | elif'CC'in conf.environ: | |
13 | cc=conf.environ['CC'] | |
14 | if not cc: | |
15 | cc=conf.find_program('cc',var='CC') | |
16 | if not cc: | |
17 | conf.fatal('irixcc was not found') | |
14 | 18 | try: |
15 | 19 | conf.cmd_and_log(cc+['-version']) |
16 | 20 | except Exception: |
17 | 21 | conf.fatal('%r -version could not be executed'%cc) |
18 | v['CC']=cc | |
19 | v['CC_NAME']='irix' | |
22 | v.CC=cc | |
23 | v.CC_NAME='irix' | |
20 | 24 | @conf |
21 | 25 | def irixcc_common_flags(conf): |
22 | 26 | v=conf.env |
23 | v['CC_SRC_F']='' | |
24 | v['CC_TGT_F']=['-c','-o'] | |
25 | v['CPPPATH_ST']='-I%s' | |
26 | v['DEFINES_ST']='-D%s' | |
27 | if not v['LINK_CC']:v['LINK_CC']=v['CC'] | |
28 | v['CCLNK_SRC_F']='' | |
29 | v['CCLNK_TGT_F']=['-o'] | |
30 | v['LIB_ST']='-l%s' | |
31 | v['LIBPATH_ST']='-L%s' | |
32 | v['STLIB_ST']='-l%s' | |
33 | v['STLIBPATH_ST']='-L%s' | |
34 | v['cprogram_PATTERN']='%s' | |
35 | v['cshlib_PATTERN']='lib%s.so' | |
36 | v['cstlib_PATTERN']='lib%s.a' | |
27 | v.CC_SRC_F='' | |
28 | v.CC_TGT_F=['-c','-o'] | |
29 | v.CPPPATH_ST='-I%s' | |
30 | v.DEFINES_ST='-D%s' | |
31 | if not v.LINK_CC: | |
32 | v.LINK_CC=v.CC | |
33 | v.CCLNK_SRC_F='' | |
34 | v.CCLNK_TGT_F=['-o'] | |
35 | v.LIB_ST='-l%s' | |
36 | v.LIBPATH_ST='-L%s' | |
37 | v.STLIB_ST='-l%s' | |
38 | v.STLIBPATH_ST='-L%s' | |
39 | v.cprogram_PATTERN='%s' | |
40 | v.cshlib_PATTERN='lib%s.so' | |
41 | v.cstlib_PATTERN='lib%s.a' | |
37 | 42 | def configure(conf): |
38 | 43 | conf.find_irixcc() |
39 | 44 | conf.find_cpp() |
1 | 1 | # encoding: utf-8 |
2 | 2 | # WARNING! Do not edit! https://waf.io/book/index.html#_obtaining_the_waf_file |
3 | 3 | |
4 | import os,tempfile,shutil | |
5 | from waflib import Task,Utils,Errors,Node,Logs | |
4 | import os,shutil | |
5 | from waflib import Task,Utils,Errors,Node | |
6 | 6 | from waflib.Configure import conf |
7 | 7 | from waflib.TaskGen import feature,before_method,after_method |
8 | 8 | from waflib.Tools import ccroot |
40 | 40 | outdir=self.path.get_bld() |
41 | 41 | outdir.mkdir() |
42 | 42 | self.outdir=outdir |
43 | self.env['OUTDIR']=outdir.abspath() | |
43 | self.env.OUTDIR=outdir.abspath() | |
44 | 44 | self.javac_task=tsk=self.create_task('javac') |
45 | 45 | tmp=[] |
46 | 46 | srcdir=getattr(self,'srcdir','') |
74 | 74 | for x in names: |
75 | 75 | try: |
76 | 76 | y=get(x) |
77 | except Exception: | |
77 | except Errors.WafError: | |
78 | 78 | self.uselib.append(x) |
79 | 79 | else: |
80 | 80 | y.post() |
84 | 84 | else: |
85 | 85 | for tsk in y.tasks: |
86 | 86 | self.javac_task.set_run_after(tsk) |
87 | if lst: | |
88 | self.env.append_value('CLASSPATH',lst) | |
87 | self.env.append_value('CLASSPATH',lst) | |
89 | 88 | @feature('javac') |
90 | 89 | @after_method('apply_java','propagate_uselib_vars','use_javac_files') |
91 | 90 | def set_classpath(self): |
127 | 126 | jaropts.append('-C') |
128 | 127 | jaropts.append(basedir.bldpath()) |
129 | 128 | jaropts.append('.') |
130 | tsk.env['JAROPTS']=jaropts | |
131 | tsk.env['JARCREATE']=jarcreate | |
129 | tsk.env.JAROPTS=jaropts | |
130 | tsk.env.JARCREATE=jarcreate | |
132 | 131 | if getattr(self,'javac_task',None): |
133 | 132 | tsk.set_run_after(self.javac_task) |
134 | 133 | @feature('jar') |
140 | 139 | for x in names: |
141 | 140 | try: |
142 | 141 | y=get(x) |
143 | except Exception: | |
142 | except Errors.WafError: | |
144 | 143 | self.uselib.append(x) |
145 | 144 | else: |
146 | 145 | y.post() |
147 | 146 | self.jar_task.run_after.update(y.tasks) |
148 | class jar_create(Task.Task): | |
147 | class JTask(Task.Task): | |
148 | def split_argfile(self,cmd): | |
149 | inline=[cmd[0]] | |
150 | infile=[] | |
151 | for x in cmd[1:]: | |
152 | if x.startswith('-J'): | |
153 | inline.append(x) | |
154 | else: | |
155 | infile.append(self.quote_flag(x)) | |
156 | return(inline,infile) | |
157 | class jar_create(JTask): | |
149 | 158 | color='GREEN' |
150 | 159 | run_str='${JAR} ${JARCREATE} ${TGT} ${JAROPTS}' |
151 | 160 | def runnable_status(self): |
159 | 168 | except Exception: |
160 | 169 | raise Errors.WafError('Could not find the basedir %r for %r'%(self.basedir,self)) |
161 | 170 | return super(jar_create,self).runnable_status() |
162 | class javac(Task.Task): | |
171 | class javac(JTask): | |
163 | 172 | color='BLUE' |
173 | run_str='${JAVAC} -classpath ${CLASSPATH} -d ${OUTDIR} ${JAVACFLAGS} ${SRC}' | |
164 | 174 | vars=['CLASSPATH','JAVACFLAGS','JAVAC','OUTDIR'] |
165 | 175 | def uid(self): |
166 | 176 | lst=[self.__class__.__name__,self.generator.outdir.abspath()] |
177 | 187 | for x in self.srcdir: |
178 | 188 | self.inputs.extend(x.ant_glob(SOURCE_RE,remove=False)) |
179 | 189 | return super(javac,self).runnable_status() |
180 | def run(self): | |
181 | env=self.env | |
182 | gen=self.generator | |
183 | bld=gen.bld | |
184 | wd=bld.bldnode.abspath() | |
185 | def to_list(xx): | |
186 | if isinstance(xx,str):return[xx] | |
187 | return xx | |
188 | cmd=[] | |
189 | cmd.extend(to_list(env['JAVAC'])) | |
190 | cmd.extend(['-classpath']) | |
191 | cmd.extend(to_list(env['CLASSPATH'])) | |
192 | cmd.extend(['-d']) | |
193 | cmd.extend(to_list(env['OUTDIR'])) | |
194 | cmd.extend(to_list(env['JAVACFLAGS'])) | |
195 | files=[a.path_from(bld.bldnode)for a in self.inputs] | |
196 | tmp=None | |
197 | try: | |
198 | if len(str(files))+len(str(cmd))>8192: | |
199 | (fd,tmp)=tempfile.mkstemp(dir=bld.bldnode.abspath()) | |
200 | try: | |
201 | os.write(fd,'\n'.join(files).encode()) | |
202 | finally: | |
203 | if tmp: | |
204 | os.close(fd) | |
205 | if Logs.verbose: | |
206 | Logs.debug('runner: %r'%(cmd+files)) | |
207 | cmd.append('@'+tmp) | |
208 | else: | |
209 | cmd+=files | |
210 | ret=self.exec_command(cmd,cwd=wd,env=env.env or None) | |
211 | finally: | |
212 | if tmp: | |
213 | os.remove(tmp) | |
214 | return ret | |
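The removed `javac.run` implemented the response-file workaround by hand: when the assembled command line would exceed roughly 8192 characters, it wrote the source file list to a temp file and passed it as `@file`, a syntax `javac` and `jar` both accept. The replacement delegates this to `JTask.split_argfile`. A sketch of the underlying idea, with hypothetical names (`build_command` is not a waf API):

```python
import os, tempfile

def build_command(cmd, files, limit=8192):
    """Return (argv, tmp_path): pass file names inline when the line is
    short enough, else via an '@response' file. The caller removes
    tmp_path afterwards. Sketch only; waf wires this into Task.run."""
    if len(' '.join(cmd + files)) <= limit:
        return cmd + files, None
    fd, tmp = tempfile.mkstemp()
    with os.fdopen(fd, 'w') as f:
        f.write('\n'.join(files))
    return cmd + ['@' + tmp], tmp

argv, tmp = build_command(['javac', '-d', 'out'], ['A.java', 'B.java'])
assert argv == ['javac', '-d', 'out', 'A.java', 'B.java'] and tmp is None

long_files = ['F%d.java' % i for i in range(2000)]
argv, tmp = build_command(['javac'], long_files)
assert argv[-1].startswith('@') and tmp is not None
os.remove(tmp)
```

The new `split_argfile` keeps `-J` options inline (the JVM reads those before it opens the argfile) and routes everything else through the response file.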
215 | 190 | def post_run(self): |
216 | for n in self.generator.outdir.ant_glob('**/*.class'): | |
217 | n.sig=Utils.h_file(n.abspath()) | |
191 | for node in self.generator.outdir.ant_glob('**/*.class'): | |
192 | self.generator.bld.node_sigs[node]=self.uid() | |
218 | 193 | self.generator.bld.task_sigs[self.uid()]=self.cache_sig |
219 | 194 | @feature('javadoc') |
220 | 195 | @after_method('process_rule') |
231 | 206 | def run(self): |
232 | 207 | env=self.env |
233 | 208 | bld=self.generator.bld |
234 | wd=bld.bldnode.abspath() | |
209 | wd=bld.bldnode | |
235 | 210 | srcpath=self.generator.path.abspath()+os.sep+self.generator.srcdir |
236 | 211 | srcpath+=os.pathsep |
237 | 212 | srcpath+=self.generator.path.get_bld().abspath()+os.sep+self.generator.srcdir |
240 | 215 | classpath+=os.pathsep.join(self.classpath) |
241 | 216 | classpath="".join(classpath) |
242 | 217 | self.last_cmd=lst=[] |
243 | lst.extend(Utils.to_list(env['JAVADOC'])) | |
218 | lst.extend(Utils.to_list(env.JAVADOC)) | |
244 | 219 | lst.extend(['-d',self.generator.javadoc_output.abspath()]) |
245 | 220 | lst.extend(['-sourcepath',srcpath]) |
246 | 221 | lst.extend(['-classpath',classpath]) |
250 | 225 | self.generator.bld.cmd_and_log(lst,cwd=wd,env=env.env or None,quiet=0) |
251 | 226 | def post_run(self): |
252 | 227 | nodes=self.generator.javadoc_output.ant_glob('**') |
253 | for x in nodes: | |
254 | x.sig=Utils.h_file(x.abspath()) | |
228 | for node in nodes: | |
229 | self.generator.bld.node_sigs[node]=self.uid() | |
255 | 230 | self.generator.bld.task_sigs[self.uid()]=self.cache_sig |
256 | 231 | def configure(self): |
257 | 232 | java_path=self.environ['PATH'].split(os.pathsep) |
258 | 233 | v=self.env |
259 | 234 | if'JAVA_HOME'in self.environ: |
260 | 235 | java_path=[os.path.join(self.environ['JAVA_HOME'],'bin')]+java_path |
261 | self.env['JAVA_HOME']=[self.environ['JAVA_HOME']] | |
236 | self.env.JAVA_HOME=[self.environ['JAVA_HOME']] | |
262 | 237 | for x in'javac java jar javadoc'.split(): |
263 | 238 | self.find_program(x,var=x.upper(),path_list=java_path) |
264 | 239 | if'CLASSPATH'in self.environ: |
265 | v['CLASSPATH']=self.environ['CLASSPATH'] | |
266 | if not v['JAR']:self.fatal('jar is required for making java packages') | |
267 | if not v['JAVAC']:self.fatal('javac is required for compiling java classes') | |
268 | v['JARCREATE']='cf' | |
269 | v['JAVACFLAGS']=[] | |
240 | v.CLASSPATH=self.environ['CLASSPATH'] | |
241 | if not v.JAR: | |
242 | self.fatal('jar is required for making java packages') | |
243 | if not v.JAVAC: | |
244 | self.fatal('javac is required for compiling java classes') | |
245 | v.JARCREATE='cf' | |
246 | v.JAVACFLAGS=[] | |
270 | 247 | @conf |
271 | 248 | def check_java_class(self,classname,with_classpath=None): |
272 | 249 | javatestdir='.waf-javatest' |
273 | 250 | classpath=javatestdir |
274 | if self.env['CLASSPATH']: | |
275 | classpath+=os.pathsep+self.env['CLASSPATH'] | |
251 | if self.env.CLASSPATH: | |
252 | classpath+=os.pathsep+self.env.CLASSPATH | |
276 | 253 | if isinstance(with_classpath,str): |
277 | 254 | classpath+=os.pathsep+with_classpath |
278 | 255 | shutil.rmtree(javatestdir,True) |
279 | 256 | os.mkdir(javatestdir) |
280 | 257 | Utils.writef(os.path.join(javatestdir,'Test.java'),class_check_source) |
281 | self.exec_command(self.env['JAVAC']+[os.path.join(javatestdir,'Test.java')],shell=False) | |
282 | cmd=self.env['JAVA']+['-cp',classpath,'Test',classname] | |
258 | self.exec_command(self.env.JAVAC+[os.path.join(javatestdir,'Test.java')],shell=False) | |
259 | cmd=self.env.JAVA+['-cp',classpath,'Test',classname] | |
283 | 260 | self.to_log("%s\n"%str(cmd)) |
284 | 261 | found=self.exec_command(cmd,shell=False) |
285 | 262 | self.msg('Checking for java class %s'%classname,not found) |
291 | 268 | conf.fatal('load a compiler first (gcc, g++, ..)') |
292 | 269 | if not conf.env.JAVA_HOME: |
293 | 270 | conf.fatal('set JAVA_HOME in the system environment') |
294 | javaHome=conf.env['JAVA_HOME'][0] | |
271 | javaHome=conf.env.JAVA_HOME[0] | |
295 | 272 | dir=conf.root.find_dir(conf.env.JAVA_HOME[0]+'/include') |
296 | 273 | if dir is None: |
297 | 274 | dir=conf.root.find_dir(conf.env.JAVA_HOME[0]+'/../Headers') |
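The javaw.py configure hunk above prepends `JAVA_HOME/bin` to the search path before locating `javac`, `java`, `jar`, and `javadoc`. A minimal standalone sketch of that path construction (the function name is ours; `environ` is a plain dict standing in for waf's `self.environ`):

```python
import os

def java_search_path(environ):
    """Build the tool search path, preferring JAVA_HOME/bin when it is set."""
    path = environ.get('PATH', '').split(os.pathsep)
    if 'JAVA_HOME' in environ:
        # JAVA_HOME/bin is consulted before the regular PATH entries
        path = [os.path.join(environ['JAVA_HOME'], 'bin')] + path
    return path

p = java_search_path({'PATH': '/usr/bin', 'JAVA_HOME': '/opt/jdk'})
```

With this ordering, a JDK installed outside the system PATH still wins over any stray `java` binary found later.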
0 | #! /usr/bin/env python | |
1 | # encoding: utf-8 | |
2 | # WARNING! Do not edit! https://waf.io/book/index.html#_obtaining_the_waf_file | |
3 | ||
4 | import os,re | |
5 | from waflib import Task,Utils | |
6 | from waflib.TaskGen import feature | |
7 | @feature('msgfmt') | |
8 | def apply_msgfmt(self): | |
9 | for lang in self.to_list(self.langs): | |
10 | node=self.path.find_resource(lang+'.po') | |
11 | task=self.create_task('msgfmt',node,node.change_ext('.mo')) | |
12 | langname=lang.split('/') | |
13 | langname=langname[-1] | |
14 | inst=getattr(self,'install_path','${KDE4_LOCALE_INSTALL_DIR}') | |
15 | self.bld.install_as(inst+os.sep+langname+os.sep+'LC_MESSAGES'+os.sep+getattr(self,'appname','set_your_appname')+'.mo',task.outputs[0],chmod=getattr(self,'chmod',Utils.O644)) | |
16 | class msgfmt(Task.Task): | |
17 | color='BLUE' | |
18 | run_str='${MSGFMT} ${SRC} -o ${TGT}' | |
19 | def configure(self): | |
20 | kdeconfig=self.find_program('kde4-config') | |
21 | prefix=self.cmd_and_log(kdeconfig+['--prefix']).strip() | |
22 | fname='%s/share/apps/cmake/modules/KDELibsDependencies.cmake'%prefix | |
23 | try:os.stat(fname) | |
24 | except OSError: | |
25 | fname='%s/share/kde4/apps/cmake/modules/KDELibsDependencies.cmake'%prefix | |
26 | try:os.stat(fname) | |
27 | except OSError:self.fatal('could not open %s'%fname) | |
28 | try: | |
29 | txt=Utils.readf(fname) | |
30 | except EnvironmentError: | |
31 | self.fatal('could not read %s'%fname) | |
32 | txt=txt.replace('\\\n','\n') | |
33 | fu=re.compile('#(.*)\n') | |
34 | txt=fu.sub('',txt) | |
35 | setregexp=re.compile('([sS][eE][tT]\s*\()\s*([^\s]+)\s+\"([^"]+)\"\)') | |
36 | found=setregexp.findall(txt) | |
37 | for(_,key,val)in found: | |
38 | self.env[key]=val | |
39 | self.env['LIB_KDECORE']=['kdecore'] | |
40 | self.env['LIB_KDEUI']=['kdeui'] | |
41 | self.env['LIB_KIO']=['kio'] | |
42 | self.env['LIB_KHTML']=['khtml'] | |
43 | self.env['LIB_KPARTS']=['kparts'] | |
44 | self.env['LIBPATH_KDECORE']=[os.path.join(self.env.KDE4_LIB_INSTALL_DIR,'kde4','devel'),self.env.KDE4_LIB_INSTALL_DIR] | |
45 | self.env['INCLUDES_KDECORE']=[self.env['KDE4_INCLUDE_INSTALL_DIR']] | |
46 | self.env.append_value('INCLUDES_KDECORE',[self.env['KDE4_INCLUDE_INSTALL_DIR']+os.sep+'KDE']) | |
47 | self.find_program('msgfmt',var='MSGFMT') |
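The kde4.py file above imports every `set(KEY "value")` entry from KDE's `KDELibsDependencies.cmake` into the build environment after stripping comments. A self-contained sketch of that extraction, using the same pattern as the tool (the sample file content is invented for illustration):

```python
import re

# Invented sample of KDELibsDependencies.cmake content
cmake_text = '''
# comment line to be stripped
set(KDE4_INCLUDE_INSTALL_DIR "/usr/include/kde4")
SET(KDE4_LIB_INSTALL_DIR "/usr/lib/kde4")
'''

# Drop comment lines first, as the waf tool does before matching
cmake_text = re.sub(r'#(.*)\n', '', cmake_text)

# Same shape as the tool's pattern: set(KEY "value"), 'set' case-insensitive
setregexp = re.compile(r'([sS][eE][tT]\s*\()\s*(\S+)\s+"([^"]+)"\)')
env = {key: val for _, key, val in setregexp.findall(cmake_text)}
```

Each matched key lands in the environment verbatim, which is why the hunk can then reference `KDE4_LIB_INSTALL_DIR` and friends directly.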
12 | 12 | @conf |
13 | 13 | def common_flags_ldc2(conf): |
14 | 14 | v=conf.env |
15 | v['D_SRC_F']=['-c'] | |
16 | v['D_TGT_F']='-of%s' | |
17 | v['D_LINKER']=v['D'] | |
18 | v['DLNK_SRC_F']='' | |
19 | v['DLNK_TGT_F']='-of%s' | |
20 | v['DINC_ST']='-I%s' | |
21 | v['DSHLIB_MARKER']=v['DSTLIB_MARKER']='' | |
22 | v['DSTLIB_ST']=v['DSHLIB_ST']='-L-l%s' | |
23 | v['DSTLIBPATH_ST']=v['DLIBPATH_ST']='-L-L%s' | |
24 | v['LINKFLAGS_dshlib']=['-L-shared'] | |
25 | v['DHEADER_ext']='.di' | |
26 | v['DFLAGS_d_with_header']=['-H','-Hf'] | |
27 | v['D_HDR_F']='%s' | |
28 | v['LINKFLAGS']=[] | |
29 | v['DFLAGS_dshlib']=['-relocation-model=pic'] | |
15 | v.D_SRC_F=['-c'] | |
16 | v.D_TGT_F='-of%s' | |
17 | v.D_LINKER=v.D | |
18 | v.DLNK_SRC_F='' | |
19 | v.DLNK_TGT_F='-of%s' | |
20 | v.DINC_ST='-I%s' | |
21 | v.DSHLIB_MARKER=v.DSTLIB_MARKER='' | |
22 | v.DSTLIB_ST=v.DSHLIB_ST='-L-l%s' | |
23 | v.DSTLIBPATH_ST=v.DLIBPATH_ST='-L-L%s' | |
24 | v.LINKFLAGS_dshlib=['-L-shared'] | |
25 | v.DHEADER_ext='.di' | |
26 | v.DFLAGS_d_with_header=['-H','-Hf'] | |
27 | v.D_HDR_F='%s' | |
28 | v.LINKFLAGS=[] | |
29 | v.DFLAGS_dshlib=['-relocation-model=pic'] | |
30 | 30 | def configure(conf): |
31 | 31 | conf.find_ldc2() |
32 | 32 | conf.load('ar') |
8 | 8 | tsk=self.create_task('luac',node,node.change_ext('.luac')) |
9 | 9 | inst_to=getattr(self,'install_path',self.env.LUADIR and'${LUADIR}'or None) |
10 | 10 | if inst_to: |
11 | self.bld.install_files(inst_to,tsk.outputs) | |
11 | self.add_install_files(install_to=inst_to,install_from=tsk.outputs) | |
12 | 12 | return tsk |
13 | 13 | class luac(Task.Task): |
14 | 14 | run_str='${LUAC} -s -o ${TGT} ${SRC}' |
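The `luac` task above is driven entirely by the command template `${LUAC} -s -o ${TGT} ${SRC}`, which waf expands from the task's environment and node lists at run time. A toy sketch of that substitution step (the `expand` helper is ours, not waf's API):

```python
import re

def expand(run_str, **vars):
    """Replace ${NAME} tokens in a waf-style run_str with supplied values."""
    return re.sub(r'\$\{(\w+)\}', lambda m: vars[m.group(1)], run_str)

cmd = expand('${LUAC} -s -o ${TGT} ${SRC}',
             LUAC='luac', TGT='script.luac', SRC='script.lua')
```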
0 | #! /usr/bin/env python | |
1 | # encoding: utf-8 | |
2 | # WARNING! Do not edit! https://waf.io/book/index.html#_obtaining_the_waf_file | |
3 | ||
4 | import os,stat | |
5 | from waflib import Utils,Build,Node | |
6 | STRONGEST=True | |
7 | Build.SAVED_ATTRS.append('hashes_md5_tstamp') | |
8 | def h_file(self): | |
9 | filename=self.abspath() | |
10 | st=os.stat(filename) | |
11 | cache=self.ctx.hashes_md5_tstamp | |
12 | if filename in cache and cache[filename][0]==st.st_mtime: | |
13 | return cache[filename][1] | |
14 | global STRONGEST | |
15 | if STRONGEST: | |
16 | ret=Utils.h_file(filename) | |
17 | else: | |
18 | if stat.S_ISDIR(st[stat.ST_MODE]): | |
19 | raise IOError('Not a file') | |
20 | ret=Utils.md5(str((st.st_mtime,st.st_size)).encode()).digest() | |
21 | cache[filename]=(st.st_mtime,ret) | |
22 | return ret | |
23 | h_file.__doc__=Node.Node.h_file.__doc__ | |
24 | Node.Node.h_file=h_file |
1 | 1 | # encoding: utf-8 |
2 | 2 | # WARNING! Do not edit! https://waf.io/book/index.html#_obtaining_the_waf_file |
3 | 3 | |
4 | import os,sys,re,tempfile | |
5 | from waflib import Utils,Task,Logs,Options,Errors | |
6 | from waflib.Logs import debug,warn | |
4 | import os,sys,re | |
5 | from waflib import Utils,Logs,Options,Errors | |
7 | 6 | from waflib.TaskGen import after_method,feature |
8 | 7 | from waflib.Configure import conf |
9 | from waflib.Tools import ccroot,c,cxx,ar,winres | |
8 | from waflib.Tools import ccroot,c,cxx,ar | |
10 | 9 | g_msvc_systemlibs=''' |
11 | 10 | aclui activeds ad1 adptif adsiid advapi32 asycfilt authz bhsupp bits bufferoverflowu cabinet |
12 | 11 | cap certadm certidl ciuuid clusapi comctl32 comdlg32 comsupp comsuppd comsuppw comsuppwd comsvcs |
33 | 32 | def options(opt): |
34 | 33 | opt.add_option('--msvc_version',type='string',help='msvc version, eg: "msvc 10.0,msvc 9.0"',default='') |
35 | 34 | opt.add_option('--msvc_targets',type='string',help='msvc targets, eg: "x64,arm"',default='') |
36 | opt.add_option('--msvc_lazy_autodetect',action='store_true',help='lazily check msvc target environments') | |
37 | def setup_msvc(conf,versions,arch=False): | |
35 | opt.add_option('--no-msvc-lazy',action='store_false',help='lazily check msvc target environments',default=True,dest='msvc_lazy') | |
36 | @conf | |
37 | def setup_msvc(conf,versiondict): | |
38 | 38 | platforms=getattr(Options.options,'msvc_targets','').split(',') |
39 | 39 | if platforms==['']: |
40 | platforms=Utils.to_list(conf.env['MSVC_TARGETS'])or[i for i,j in all_msvc_platforms+all_icl_platforms+all_wince_platforms] | |
40 | platforms=Utils.to_list(conf.env.MSVC_TARGETS)or[i for i,j in all_msvc_platforms+all_icl_platforms+all_wince_platforms] | |
41 | 41 | desired_versions=getattr(Options.options,'msvc_version','').split(',') |
42 | 42 | if desired_versions==['']: |
43 | desired_versions=conf.env['MSVC_VERSIONS']or[v for v,_ in versions][::-1] | |
44 | versiondict=dict(versions) | |
43 | desired_versions=conf.env.MSVC_VERSIONS or list(reversed(list(versiondict.keys()))) | |
44 | lazy_detect=getattr(Options.options,'msvc_lazy',True) | |
45 | if conf.env.MSVC_LAZY_AUTODETECT is False: | |
46 | lazy_detect=False | |
47 | if not lazy_detect: | |
48 | for val in versiondict.values(): | |
49 | for arch in list(val.keys()): | |
50 | cfg=val[arch] | |
51 | cfg.evaluate() | |
52 | if not cfg.is_valid: | |
53 | del val[arch] | |
54 | conf.env.MSVC_INSTALLED_VERSIONS=versiondict | |
45 | 55 | for version in desired_versions: |
46 | 56 | try: |
47 | targets=dict(versiondict[version]) | |
48 | for target in platforms: | |
49 | try: | |
50 | try: | |
51 | realtarget,(p1,p2,p3)=targets[target] | |
52 | except conf.errors.ConfigurationError: | |
53 | del(targets[target]) | |
54 | else: | |
55 | compiler,revision=version.rsplit(' ',1) | |
56 | if arch: | |
57 | return compiler,revision,p1,p2,p3,realtarget | |
58 | else: | |
59 | return compiler,revision,p1,p2,p3 | |
60 | except KeyError:continue | |
61 | except KeyError:continue | |
62 | conf.fatal('msvc: Impossible to find a valid architecture for building (in setup_msvc)') | |
57 | targets=versiondict[version] | |
58 | except KeyError: | |
59 | continue | |
60 | for arch in platforms: | |
61 | try: | |
62 | cfg=targets[arch] | |
63 | except KeyError: | |
64 | continue | |
65 | cfg.evaluate() | |
66 | if cfg.is_valid: | |
67 | compiler,revision=version.rsplit(' ',1) | |
68 | return compiler,revision,cfg.bindirs,cfg.incdirs,cfg.libdirs,cfg.cpu | |
69 | conf.fatal('msvc: Impossible to find a valid architecture for building %r - %r'%(desired_versions,list(versiondict.keys()))) | |
63 | 70 | @conf |
64 | 71 | def get_msvc_version(conf,compiler,version,target,vcvars): |
65 | debug('msvc: get_msvc_version: %r %r %r',compiler,version,target) | |
72 | Logs.debug('msvc: get_msvc_version: %r %r %r',compiler,version,target) | |
66 | 73 | try: |
67 | 74 | conf.msvc_cnt+=1 |
68 | 75 | except AttributeError: |
98 | 105 | if'CL'in env: |
99 | 106 | del(env['CL']) |
100 | 107 | try: |
101 | try: | |
102 | conf.cmd_and_log(cxx+['/help'],env=env) | |
103 | except UnicodeError: | |
104 | st=Utils.ex_stack() | |
105 | if conf.logger: | |
106 | conf.logger.error(st) | |
107 | conf.fatal('msvc: Unicode error - check the code page?') | |
108 | except Exception as e: | |
109 | debug('msvc: get_msvc_version: %r %r %r -> failure %s'%(compiler,version,target,str(e))) | |
110 | conf.fatal('msvc: cannot run the compiler in get_msvc_version (run with -v to display errors)') | |
111 | else: | |
112 | debug('msvc: get_msvc_version: %r %r %r -> OK',compiler,version,target) | |
108 | conf.cmd_and_log(cxx+['/help'],env=env) | |
109 | except UnicodeError: | |
110 | st=Utils.ex_stack() | |
111 | if conf.logger: | |
112 | conf.logger.error(st) | |
113 | conf.fatal('msvc: Unicode error - check the code page?') | |
114 | except Exception as e: | |
115 | Logs.debug('msvc: get_msvc_version: %r %r %r -> failure %s',compiler,version,target,str(e)) | |
116 | conf.fatal('msvc: cannot run the compiler in get_msvc_version (run with -v to display errors)') | |
117 | else: | |
118 | Logs.debug('msvc: get_msvc_version: %r %r %r -> OK',compiler,version,target) | |
113 | 119 | finally: |
114 | 120 | conf.env[compiler_name]='' |
115 | 121 | return(MSVC_PATH,MSVC_INCDIR,MSVC_LIBDIR) |
129 | 135 | version=Utils.winreg.EnumKey(all_versions,index) |
130 | 136 | except WindowsError: |
131 | 137 | break |
132 | index=index+1 | |
138 | index+=1 | |
133 | 139 | if not version_pattern.match(version): |
134 | 140 | continue |
135 | 141 | try: |
138 | 144 | except WindowsError: |
139 | 145 | continue |
140 | 146 | if path and os.path.isfile(os.path.join(path,'bin','SetEnv.cmd')): |
141 | targets=[] | |
147 | targets={} | |
142 | 148 | for target,arch in all_msvc_platforms: |
143 | try: | |
144 | targets.append((target,(arch,get_compiler_env(conf,'wsdk',version,'/'+target,os.path.join(path,'bin','SetEnv.cmd'))))) | |
145 | except conf.errors.ConfigurationError: | |
146 | pass | |
147 | versions.append(('wsdk '+version[1:],targets)) | |
149 | targets[target]=target_compiler(conf,'wsdk',arch,version,'/'+target,os.path.join(path,'bin','SetEnv.cmd')) | |
150 | versions['wsdk '+version[1:]]=targets | |
148 | 151 | def gather_wince_supported_platforms(): |
149 | 152 | supported_wince_platforms=[] |
150 | 153 | try: |
156 | 159 | ce_sdk='' |
157 | 160 | if not ce_sdk: |
158 | 161 | return supported_wince_platforms |
159 | ce_index=0 | |
162 | index=0 | |
160 | 163 | while 1: |
161 | 164 | try: |
162 | sdk_device=Utils.winreg.EnumKey(ce_sdk,ce_index) | |
165 | sdk_device=Utils.winreg.EnumKey(ce_sdk,index) | |
166 | sdk=Utils.winreg.OpenKey(ce_sdk,sdk_device) | |
163 | 167 | except WindowsError: |
164 | 168 | break |
165 | ce_index=ce_index+1 | |
166 | sdk=Utils.winreg.OpenKey(ce_sdk,sdk_device) | |
169 | index+=1 | |
167 | 170 | try: |
168 | 171 | path,type=Utils.winreg.QueryValueEx(sdk,'SDKRootDir') |
169 | 172 | except WindowsError: |
170 | 173 | try: |
171 | 174 | path,type=Utils.winreg.QueryValueEx(sdk,'SDKInformation') |
172 | path,xml=os.path.split(path) | |
173 | 175 | except WindowsError: |
174 | 176 | continue |
177 | path,xml=os.path.split(path) | |
175 | 178 | path=str(path) |
176 | 179 | path,device=os.path.split(path) |
177 | 180 | if not device: |
187 | 190 | version_pattern=re.compile('^(\d\d?\.\d\d?)(Exp)?$') |
188 | 191 | detected_versions=[] |
189 | 192 | for vcver,vcvar in(('VCExpress','Exp'),('VisualStudio','')): |
190 | try: | |
191 | prefix='SOFTWARE\\Wow6432node\\Microsoft\\'+vcver | |
193 | prefix='SOFTWARE\\Wow6432node\\Microsoft\\'+vcver | |
194 | try: | |
192 | 195 | all_versions=Utils.winreg.OpenKey(Utils.winreg.HKEY_LOCAL_MACHINE,prefix) |
193 | 196 | except WindowsError: |
194 | try: | |
195 | prefix='SOFTWARE\\Microsoft\\'+vcver | |
197 | prefix='SOFTWARE\\Microsoft\\'+vcver | |
198 | try: | |
196 | 199 | all_versions=Utils.winreg.OpenKey(Utils.winreg.HKEY_LOCAL_MACHINE,prefix) |
197 | 200 | except WindowsError: |
198 | 201 | continue |
202 | 205 | version=Utils.winreg.EnumKey(all_versions,index) |
203 | 206 | except WindowsError: |
204 | 207 | break |
205 | index=index+1 | |
208 | index+=1 | |
206 | 209 | match=version_pattern.match(version) |
207 | if not match: | |
210 | if match: | |
211 | versionnumber=float(match.group(1)) | |
212 | else: | |
208 | 213 | continue |
209 | else: | |
210 | versionnumber=float(match.group(1)) | |
211 | detected_versions.append((versionnumber,version+vcvar,prefix+"\\"+version)) | |
214 | detected_versions.append((versionnumber,version+vcvar,prefix+'\\'+version)) | |
212 | 215 | def fun(tup): |
213 | 216 | return tup[0] |
214 | 217 | detected_versions.sort(key=fun) |
215 | 218 | return detected_versions |
216 | def get_compiler_env(conf,compiler,version,bat_target,bat,select=None): | |
217 | lazy=getattr(Options.options,'msvc_lazy_autodetect',False)or conf.env['MSVC_LAZY_AUTODETECT'] | |
218 | def msvc_thunk(): | |
219 | vs=conf.get_msvc_version(compiler,version,bat_target,bat) | |
220 | if select: | |
221 | return select(vs) | |
222 | else: | |
223 | return vs | |
224 | return lazytup(msvc_thunk,lazy,([],[],[])) | |
225 | class lazytup(object): | |
226 | def __init__(self,fn,lazy=True,default=None): | |
227 | self.fn=fn | |
228 | self.default=default | |
229 | if not lazy: | |
230 | self.evaluate() | |
231 | def __len__(self): | |
232 | self.evaluate() | |
233 | return len(self.value) | |
234 | def __iter__(self): | |
235 | self.evaluate() | |
236 | for i,v in enumerate(self.value): | |
237 | yield v | |
238 | def __getitem__(self,i): | |
239 | self.evaluate() | |
240 | return self.value[i] | |
219 | class target_compiler(object): | |
220 | def __init__(self,ctx,compiler,cpu,version,bat_target,bat,callback=None): | |
221 | self.conf=ctx | |
222 | self.name=None | |
223 | self.is_valid=False | |
224 | self.is_done=False | |
225 | self.compiler=compiler | |
226 | self.cpu=cpu | |
227 | self.version=version | |
228 | self.bat_target=bat_target | |
229 | self.bat=bat | |
230 | self.callback=callback | |
231 | def evaluate(self): | |
232 | if self.is_done: | |
233 | return | |
234 | self.is_done=True | |
235 | try: | |
236 | vs=self.conf.get_msvc_version(self.compiler,self.version,self.bat_target,self.bat) | |
237 | except Errors.ConfigurationError: | |
238 | self.is_valid=False | |
239 | return | |
240 | if self.callback: | |
241 | vs=self.callback(self,vs) | |
242 | self.is_valid=True | |
243 | (self.bindirs,self.incdirs,self.libdirs)=vs | |
244 | def __str__(self): | |
245 | return str((self.bindirs,self.incdirs,self.libdirs)) | |
241 | 246 | def __repr__(self): |
242 | if hasattr(self,'value'): | |
243 | return repr(self.value) | |
244 | elif self.default: | |
245 | return repr(self.default) | |
246 | else: | |
247 | self.evaluate() | |
248 | return repr(self.value) | |
249 | def evaluate(self): | |
250 | if hasattr(self,'value'): | |
251 | return | |
252 | self.value=self.fn() | |
247 | return repr((self.bindirs,self.incdirs,self.libdirs)) | |
253 | 248 | @conf |
254 | 249 | def gather_msvc_targets(conf,versions,version,vc_path): |
255 | targets=[] | |
250 | targets={} | |
256 | 251 | if os.path.isfile(os.path.join(vc_path,'vcvarsall.bat')): |
257 | 252 | for target,realtarget in all_msvc_platforms[::-1]: |
258 | try: | |
259 | targets.append((target,(realtarget,get_compiler_env(conf,'msvc',version,target,os.path.join(vc_path,'vcvarsall.bat'))))) | |
260 | except conf.errors.ConfigurationError: | |
261 | pass | |
253 | targets[target]=target_compiler(conf,'msvc',realtarget,version,target,os.path.join(vc_path,'vcvarsall.bat')) | |
262 | 254 | elif os.path.isfile(os.path.join(vc_path,'Common7','Tools','vsvars32.bat')): |
263 | try: | |
264 | targets.append(('x86',('x86',get_compiler_env(conf,'msvc',version,'x86',os.path.join(vc_path,'Common7','Tools','vsvars32.bat'))))) | |
265 | except conf.errors.ConfigurationError: | |
266 | pass | |
255 | targets['x86']=target_compiler(conf,'msvc','x86',version,'x86',os.path.join(vc_path,'Common7','Tools','vsvars32.bat')) | |
267 | 256 | elif os.path.isfile(os.path.join(vc_path,'Bin','vcvars32.bat')): |
268 | try: | |
269 | targets.append(('x86',('x86',get_compiler_env(conf,'msvc',version,'',os.path.join(vc_path,'Bin','vcvars32.bat'))))) | |
270 | except conf.errors.ConfigurationError: | |
271 | pass | |
257 | targets['x86']=target_compiler(conf,'msvc','x86',version,'',os.path.join(vc_path,'Bin','vcvars32.bat')) | |
272 | 258 | if targets: |
273 | versions.append(('msvc '+version,targets)) | |
259 | versions['msvc '+version]=targets | |
274 | 260 | @conf |
275 | 261 | def gather_wince_targets(conf,versions,version,vc_path,vsvars,supported_platforms): |
276 | 262 | for device,platforms in supported_platforms: |
277 | cetargets=[] | |
263 | targets={} | |
278 | 264 | for platform,compiler,include,lib in platforms: |
279 | 265 | winCEpath=os.path.join(vc_path,'ce') |
280 | 266 | if not os.path.isdir(winCEpath): |
283 | 269 | bindirs=[os.path.join(winCEpath,'bin',compiler),os.path.join(winCEpath,'bin','x86_'+compiler)] |
284 | 270 | incdirs=[os.path.join(winCEpath,'include'),os.path.join(winCEpath,'atlmfc','include'),include] |
285 | 271 | libdirs=[os.path.join(winCEpath,'lib',platform),os.path.join(winCEpath,'atlmfc','lib',platform),lib] |
286 | def combine_common(compiler_env): | |
272 | def combine_common(obj,compiler_env): | |
287 | 273 | (common_bindirs,_1,_2)=compiler_env |
288 | 274 | return(bindirs+common_bindirs,incdirs,libdirs) |
289 | try: | |
290 | cetargets.append((platform,(platform,get_compiler_env(conf,'msvc',version,'x86',vsvars,combine_common)))) | |
291 | except conf.errors.ConfigurationError: | |
292 | continue | |
293 | if cetargets: | |
294 | versions.append((device+' '+version,cetargets)) | |
275 | targets[platform]=target_compiler(conf,'msvc',platform,version,'x86',vsvars,combine_common) | |
276 | if targets: | |
277 | versions[device+' '+version]=targets | |
295 | 278 | @conf |
296 | 279 | def gather_winphone_targets(conf,versions,version,vc_path,vsvars): |
297 | targets=[] | |
280 | targets={} | |
298 | 281 | for target,realtarget in all_msvc_platforms[::-1]: |
299 | try: | |
300 | targets.append((target,(realtarget,get_compiler_env(conf,'winphone',version,target,vsvars)))) | |
301 | except conf.errors.ConfigurationError: | |
302 | pass | |
282 | targets[target]=target_compiler(conf,'winphone',realtarget,version,target,vsvars) | |
303 | 283 | if targets: |
304 | versions.append(('winphone '+version,targets)) | |
284 | versions['winphone '+version]=targets | |
305 | 285 | @conf |
306 | 286 | def gather_msvc_versions(conf,versions): |
307 | 287 | vc_paths=[] |
312 | 292 | except WindowsError: |
313 | 293 | msvc_version=Utils.winreg.OpenKey(Utils.winreg.HKEY_LOCAL_MACHINE,reg+"\\Setup\\Microsoft Visual C++") |
314 | 294 | path,type=Utils.winreg.QueryValueEx(msvc_version,'ProductDir') |
295 | except WindowsError: | |
296 | continue | |
297 | else: | |
315 | 298 | vc_paths.append((version,os.path.abspath(str(path)))) |
316 | except WindowsError: | |
317 | continue | |
318 | 299 | wince_supported_platforms=gather_wince_supported_platforms() |
319 | 300 | for version,vc_path in vc_paths: |
320 | 301 | vs_path=os.path.dirname(vc_path) |
346 | 327 | version=Utils.winreg.EnumKey(all_versions,index) |
347 | 328 | except WindowsError: |
348 | 329 | break |
349 | index=index+1 | |
330 | index+=1 | |
350 | 331 | if not version_pattern.match(version): |
351 | 332 | continue |
352 | targets=[] | |
333 | targets={} | |
353 | 334 | for target,arch in all_icl_platforms: |
354 | try: | |
355 | if target=='intel64':targetDir='EM64T_NATIVE' | |
356 | else:targetDir=target | |
335 | if target=='intel64':targetDir='EM64T_NATIVE' | |
336 | else:targetDir=target | |
337 | try: | |
357 | 338 | Utils.winreg.OpenKey(all_versions,version+'\\'+targetDir) |
358 | 339 | icl_version=Utils.winreg.OpenKey(all_versions,version) |
359 | 340 | path,type=Utils.winreg.QueryValueEx(icl_version,'ProductDir') |
341 | except WindowsError: | |
342 | pass | |
343 | else: | |
360 | 344 | batch_file=os.path.join(path,'bin','iclvars.bat') |
361 | 345 | if os.path.isfile(batch_file): |
362 | try: | |
363 | targets.append((target,(arch,get_compiler_env(conf,'intel',version,target,batch_file)))) | |
364 | except conf.errors.ConfigurationError: | |
365 | pass | |
366 | except WindowsError: | |
367 | pass | |
346 | targets[target]=target_compiler(conf,'intel',arch,version,target,batch_file) | |
368 | 347 | for target,arch in all_icl_platforms: |
369 | 348 | try: |
370 | 349 | icl_version=Utils.winreg.OpenKey(all_versions,version+'\\'+target) |
371 | 350 | path,type=Utils.winreg.QueryValueEx(icl_version,'ProductDir') |
351 | except WindowsError: | |
352 | continue | |
353 | else: | |
372 | 354 | batch_file=os.path.join(path,'bin','iclvars.bat') |
373 | 355 | if os.path.isfile(batch_file): |
374 | try: | |
375 | targets.append((target,(arch,get_compiler_env(conf,'intel',version,target,batch_file)))) | |
376 | except conf.errors.ConfigurationError: | |
377 | pass | |
378 | except WindowsError: | |
379 | continue | |
356 | targets[target]=target_compiler(conf,'intel',arch,version,target,batch_file) | |
380 | 357 | major=version[0:2] |
381 | versions.append(('intel '+major,targets)) | |
358 | versions['intel '+major]=targets | |
382 | 359 | @conf |
383 | 360 | def gather_intel_composer_versions(conf,versions): |
384 | 361 | version_pattern=re.compile('^...?.?\...?.?.?') |
395 | 372 | version=Utils.winreg.EnumKey(all_versions,index) |
396 | 373 | except WindowsError: |
397 | 374 | break |
398 | index=index+1 | |
375 | index+=1 | |
399 | 376 | if not version_pattern.match(version): |
400 | 377 | continue |
401 | targets=[] | |
378 | targets={} | |
402 | 379 | for target,arch in all_icl_platforms: |
403 | try: | |
404 | if target=='intel64':targetDir='EM64T_NATIVE' | |
405 | else:targetDir=target | |
380 | if target=='intel64':targetDir='EM64T_NATIVE' | |
381 | else:targetDir=target | |
382 | try: | |
406 | 383 | try: |
407 | 384 | defaults=Utils.winreg.OpenKey(all_versions,version+'\\Defaults\\C++\\'+targetDir) |
408 | 385 | except WindowsError: |
409 | 386 | if targetDir=='EM64T_NATIVE': |
410 | 387 | defaults=Utils.winreg.OpenKey(all_versions,version+'\\Defaults\\C++\\EM64T') |
411 | 388 | else: |
412 | raise WindowsError | |
389 | raise | |
413 | 390 | uid,type=Utils.winreg.QueryValueEx(defaults,'SubKey') |
414 | 391 | Utils.winreg.OpenKey(all_versions,version+'\\'+uid+'\\C++\\'+targetDir) |
415 | 392 | icl_version=Utils.winreg.OpenKey(all_versions,version+'\\'+uid+'\\C++') |
416 | 393 | path,type=Utils.winreg.QueryValueEx(icl_version,'ProductDir') |
394 | except WindowsError: | |
395 | pass | |
396 | else: | |
417 | 397 | batch_file=os.path.join(path,'bin','iclvars.bat') |
418 | 398 | if os.path.isfile(batch_file): |
419 | try: | |
420 | targets.append((target,(arch,get_compiler_env(conf,'intel',version,target,batch_file)))) | |
421 | except conf.errors.ConfigurationError: | |
422 | pass | |
399 | targets[target]=target_compiler(conf,'intel',arch,version,target,batch_file) | |
423 | 400 | compilervars_warning_attr='_compilervars_warning_key' |
424 | 401 | if version[0:2]=='13'and getattr(conf,compilervars_warning_attr,True): |
425 | 402 | setattr(conf,compilervars_warning_attr,False) |
431 | 408 | dev_env_path=os.environ[vscomntool]+r'..\IDE\devenv.exe' |
432 | 409 | if(r'if exist "%VS110COMNTOOLS%..\IDE\VSWinExpress.exe"'in Utils.readf(compilervars_arch)and not os.path.exists(vs_express_path)and not os.path.exists(dev_env_path)): |
433 | 410 | Logs.warn(('The Intel compilervar_arch.bat only checks for one Visual Studio SKU ''(VSWinExpress.exe) but it does not seem to be installed at %r. ''The intel command line set up will fail to configure unless the file %r''is patched. See: %s')%(vs_express_path,compilervars_arch,patch_url)) |
434 | except WindowsError: | |
435 | pass | |
436 | 411 | major=version[0:2] |
437 | versions.append(('intel '+major,targets)) | |
438 | @conf | |
439 | def get_msvc_versions(conf,eval_and_save=True): | |
440 | if conf.env['MSVC_INSTALLED_VERSIONS']: | |
441 | return conf.env['MSVC_INSTALLED_VERSIONS'] | |
442 | lst=[] | |
443 | conf.gather_icl_versions(lst) | |
444 | conf.gather_intel_composer_versions(lst) | |
445 | conf.gather_wsdk_versions(lst) | |
446 | conf.gather_msvc_versions(lst) | |
447 | if eval_and_save: | |
448 | def checked_target(t): | |
449 | target,(arch,paths)=t | |
450 | try: | |
451 | paths.evaluate() | |
452 | except conf.errors.ConfigurationError: | |
453 | return None | |
454 | else: | |
455 | return t | |
456 | lst=[(version,list(filter(checked_target,targets)))for version,targets in lst] | |
457 | conf.env['MSVC_INSTALLED_VERSIONS']=lst | |
458 | return lst | |
459 | @conf | |
460 | def print_all_msvc_detected(conf): | |
461 | for version,targets in conf.env['MSVC_INSTALLED_VERSIONS']: | |
462 | Logs.info(version) | |
463 | for target,l in targets: | |
464 | Logs.info("\t"+target) | |
465 | @conf | |
466 | def detect_msvc(conf,arch=False): | |
467 | lazy_detect=getattr(Options.options,'msvc_lazy_autodetect',False)or conf.env['MSVC_LAZY_AUTODETECT'] | |
468 | versions=get_msvc_versions(conf,not lazy_detect) | |
469 | return setup_msvc(conf,versions,arch) | |
412 | versions['intel '+major]=targets | |
413 | @conf | |
414 | def detect_msvc(self): | |
415 | return self.setup_msvc(self.get_msvc_versions()) | |
416 | @conf | |
417 | def get_msvc_versions(self): | |
418 | dct={} | |
419 | self.gather_icl_versions(dct) | |
420 | self.gather_intel_composer_versions(dct) | |
421 | self.gather_wsdk_versions(dct) | |
422 | self.gather_msvc_versions(dct) | |
423 | return dct | |
470 | 424 | @conf |
471 | 425 | def find_lt_names_msvc(self,libname,is_static=False): |
472 | 426 | lt_names=['lib%s.la'%libname,'%s.la'%libname,] |
473 | for path in self.env['LIBPATH']: | |
427 | for path in self.env.LIBPATH: | |
474 | 428 | for la in lt_names: |
475 | 429 | laf=os.path.join(path,la) |
476 | 430 | dll=None |
509 | 463 | if lt_static==True: |
510 | 464 | return os.path.join(lt_path,lt_libname) |
511 | 465 | if lt_path!=None: |
512 | _libpaths=[lt_path]+self.env['LIBPATH'] | |
466 | _libpaths=[lt_path]+self.env.LIBPATH | |
513 | 467 | else: |
514 | _libpaths=self.env['LIBPATH'] | |
468 | _libpaths=self.env.LIBPATH | |
515 | 469 | static_libs=['lib%ss.lib'%lib,'lib%s.lib'%lib,'%ss.lib'%lib,'%s.lib'%lib,] |
516 | 470 | dynamic_libs=['lib%s.dll.lib'%lib,'lib%s.dll.a'%lib,'%s.dll.lib'%lib,'%s.dll.a'%lib,'lib%s_d.lib'%lib,'%s_d.lib'%lib,'%s.lib'%lib,] |
517 | 471 | libnames=static_libs |
520 | 474 | for path in _libpaths: |
521 | 475 | for libn in libnames: |
522 | 476 | if os.path.exists(os.path.join(path,libn)): |
523 | debug('msvc: lib found: %s'%os.path.join(path,libn)) | |
477 | Logs.debug('msvc: lib found: %s',os.path.join(path,libn)) | |
524 | 478 | return re.sub('\.lib$','',libn) |
525 | self.fatal("The library %r could not be found"%libname) | |
479 | self.fatal('The library %r could not be found'%libname) | |
526 | 480 | return re.sub('\.lib$','',libname) |
527 | 481 | @conf |
528 | 482 | def check_lib_msvc(self,libname,is_static=False,uselib_store=None): |
556 | 510 | v=conf.env |
557 | 511 | if v.NO_MSVC_DETECT: |
558 | 512 | return |
513 | compiler,version,path,includes,libdirs,cpu=conf.detect_msvc() | |
559 | 514 | if arch: |
560 | compiler,version,path,includes,libdirs,arch=conf.detect_msvc(True) | |
561 | v['DEST_CPU']=arch | |
562 | else: | |
563 | compiler,version,path,includes,libdirs=conf.detect_msvc() | |
564 | v['PATH']=path | |
565 | v['INCLUDES']=includes | |
566 | v['LIBPATH']=libdirs | |
567 | v['MSVC_COMPILER']=compiler | |
568 | try: | |
569 | v['MSVC_VERSION']=float(version) | |
570 | except Exception: | |
571 | v['MSVC_VERSION']=float(version[:-3]) | |
515 | v.DEST_CPU=cpu | |
516 | v.PATH=path | |
517 | v.INCLUDES=includes | |
518 | v.LIBPATH=libdirs | |
519 | v.MSVC_COMPILER=compiler | |
520 | try: | |
521 | v.MSVC_VERSION=float(version) | |
522 | except TypeError: | |
523 | v.MSVC_VERSION=float(version[:-3]) | |
572 | 524 | def _get_prog_names(conf,compiler): |
573 | 525 | if compiler=='intel': |
574 | 526 | compiler_name='ICL' |
584 | 536 | if sys.platform=='cygwin': |
585 | 537 | conf.fatal('MSVC module does not work under cygwin Python!') |
586 | 538 | v=conf.env |
587 | path=v['PATH'] | |
588 | compiler=v['MSVC_COMPILER'] | |
589 | version=v['MSVC_VERSION'] | |
539 | path=v.PATH | |
540 | compiler=v.MSVC_COMPILER | |
541 | version=v.MSVC_VERSION | |
590 | 542 | compiler_name,linker_name,lib_name=_get_prog_names(conf,compiler) |
591 | 543 | v.MSVC_MANIFEST=(compiler=='msvc'and version>=8)or(compiler=='wsdk'and version>=6)or(compiler=='intel'and version>=11) |
592 | 544 | cxx=conf.find_program(compiler_name,var='CXX',path_list=path) |
594 | 546 | if path:env.update(PATH=';'.join(path)) |
595 | 547 | if not conf.cmd_and_log(cxx+['/nologo','/help'],env=env): |
596 | 548 | conf.fatal('the msvc compiler could not be identified') |
597 | v['CC']=v['CXX']=cxx | |
598 | v['CC_NAME']=v['CXX_NAME']='msvc' | |
599 | if not v['LINK_CXX']: | |
600 | link=conf.find_program(linker_name,path_list=path) | |
601 | if link:v['LINK_CXX']=link | |
602 | else:conf.fatal('%s was not found (linker)'%linker_name) | |
603 | v['LINK']=link | |
604 | if not v['LINK_CC']: | |
605 | v['LINK_CC']=v['LINK_CXX'] | |
606 | if not v['AR']: | |
549 | v.CC=v.CXX=cxx | |
550 | v.CC_NAME=v.CXX_NAME='msvc' | |
551 | if not v.LINK_CXX: | |
552 | v.LINK_CXX=conf.find_program(linker_name,path_list=path,errmsg='%s was not found (linker)'%linker_name) | |
553 | if not v.LINK_CC: | |
554 | v.LINK_CC=v.LINK_CXX | |
555 | if not v.AR: | |
607 | 556 | stliblink=conf.find_program(lib_name,path_list=path,var='AR') |
608 | if not stliblink:return | |
609 | v['ARFLAGS']=['/NOLOGO'] | |
557 | if not stliblink: | |
558 | return | |
559 | v.ARFLAGS=['/nologo'] | |
610 | 560 | if v.MSVC_MANIFEST: |
611 | 561 | conf.find_program('MT',path_list=path,var='MT') |
612 | v['MTFLAGS']=['/NOLOGO'] | |
562 | v.MTFLAGS=['/nologo'] | |
613 | 563 | try: |
614 | 564 | conf.load('winres') |
615 | except Errors.WafError: | |
616 | warn('Resource compiler not found. Compiling resource file is disabled') | |
565 | except Errors.ConfigurationError: | |
566 | Logs.warn('Resource compiler not found. Compiling resource file is disabled') | |
617 | 567 | @conf |
618 | 568 | def visual_studio_add_flags(self): |
619 | 569 | v=self.env |
620 | try:v.prepend_value('INCLUDES',[x for x in self.environ['INCLUDE'].split(';')if x]) | |
621 | except Exception:pass | |
622 | try:v.prepend_value('LIBPATH',[x for x in self.environ['LIB'].split(';')if x]) | |
623 | except Exception:pass | |
570 | if self.environ.get('INCLUDE'): | |
571 | v.prepend_value('INCLUDES',[x for x in self.environ['INCLUDE'].split(';')if x]) | |
572 | if self.environ.get('LIB'): | |
573 | v.prepend_value('LIBPATH',[x for x in self.environ['LIB'].split(';')if x]) | |
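In `visual_studio_add_flags` above, the blanket `try/except Exception: pass` around the `INCLUDE` and `LIB` environment lookups is replaced by an explicit `environ.get()` guard. The guarded split, reduced to a standalone helper (the function name is illustrative):

```python
def split_env_paths(environ, key):
    """Split a semicolon-separated path variable such as INCLUDE or
    LIB, returning [] when the variable is absent or empty and
    dropping empty segments (as the list comprehension above does)."""
    value = environ.get(key)
    if not value:
        return []
    return [x for x in value.split(';') if x]

print(split_env_paths({'INCLUDE': r'C:\vc\include;;C:\sdk\include'}, 'INCLUDE'))
```

The explicit guard behaves the same as the old code on the happy path, but no longer swallows unrelated errors raised while splitting or prepending.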
624 | 574 | @conf |
625 | 575 | def msvc_common_flags(conf): |
626 | 576 | v=conf.env |
627 | v['DEST_BINFMT']='pe' | |
577 | v.DEST_BINFMT='pe' | |
628 | 578 | v.append_value('CFLAGS',['/nologo']) |
629 | 579 | v.append_value('CXXFLAGS',['/nologo']) |
630 | v['DEFINES_ST']='/D%s' | |
631 | v['CC_SRC_F']='' | |
632 | v['CC_TGT_F']=['/c','/Fo'] | |
633 | v['CXX_SRC_F']='' | |
634 | v['CXX_TGT_F']=['/c','/Fo'] | |
580 | v.append_value('LINKFLAGS',['/nologo']) | |
581 | v.DEFINES_ST='/D%s' | |
582 | v.CC_SRC_F='' | |
583 | v.CC_TGT_F=['/c','/Fo'] | |
584 | v.CXX_SRC_F='' | |
585 | v.CXX_TGT_F=['/c','/Fo'] | |
635 | 586 | if(v.MSVC_COMPILER=='msvc'and v.MSVC_VERSION>=8)or(v.MSVC_COMPILER=='wsdk'and v.MSVC_VERSION>=6): |
636 | v['CC_TGT_F']=['/FC']+v['CC_TGT_F'] | |
637 | v['CXX_TGT_F']=['/FC']+v['CXX_TGT_F'] | |
638 | v['CPPPATH_ST']='/I%s' | |
639 | v['AR_TGT_F']=v['CCLNK_TGT_F']=v['CXXLNK_TGT_F']='/OUT:' | |
640 | v['CFLAGS_CONSOLE']=v['CXXFLAGS_CONSOLE']=['/SUBSYSTEM:CONSOLE'] | |
641 | v['CFLAGS_NATIVE']=v['CXXFLAGS_NATIVE']=['/SUBSYSTEM:NATIVE'] | |
642 | v['CFLAGS_POSIX']=v['CXXFLAGS_POSIX']=['/SUBSYSTEM:POSIX'] | |
643 | v['CFLAGS_WINDOWS']=v['CXXFLAGS_WINDOWS']=['/SUBSYSTEM:WINDOWS'] | |
644 | v['CFLAGS_WINDOWSCE']=v['CXXFLAGS_WINDOWSCE']=['/SUBSYSTEM:WINDOWSCE'] | |
645 | v['CFLAGS_CRT_MULTITHREADED']=v['CXXFLAGS_CRT_MULTITHREADED']=['/MT'] | |
646 | v['CFLAGS_CRT_MULTITHREADED_DLL']=v['CXXFLAGS_CRT_MULTITHREADED_DLL']=['/MD'] | |
647 | v['CFLAGS_CRT_MULTITHREADED_DBG']=v['CXXFLAGS_CRT_MULTITHREADED_DBG']=['/MTd'] | |
648 | v['CFLAGS_CRT_MULTITHREADED_DLL_DBG']=v['CXXFLAGS_CRT_MULTITHREADED_DLL_DBG']=['/MDd'] | |
649 | v['LIB_ST']='%s.lib' | |
650 | v['LIBPATH_ST']='/LIBPATH:%s' | |
651 | v['STLIB_ST']='%s.lib' | |
652 | v['STLIBPATH_ST']='/LIBPATH:%s' | |
653 | v.append_value('LINKFLAGS',['/NOLOGO']) | |
654 | if v['MSVC_MANIFEST']: | |
587 | v.CC_TGT_F=['/FC']+v.CC_TGT_F | |
588 | v.CXX_TGT_F=['/FC']+v.CXX_TGT_F | |
589 | v.CPPPATH_ST='/I%s' | |
590 | v.AR_TGT_F=v.CCLNK_TGT_F=v.CXXLNK_TGT_F='/OUT:' | |
591 | v.CFLAGS_CONSOLE=v.CXXFLAGS_CONSOLE=['/SUBSYSTEM:CONSOLE'] | |
592 | v.CFLAGS_NATIVE=v.CXXFLAGS_NATIVE=['/SUBSYSTEM:NATIVE'] | |
593 | v.CFLAGS_POSIX=v.CXXFLAGS_POSIX=['/SUBSYSTEM:POSIX'] | |
594 | v.CFLAGS_WINDOWS=v.CXXFLAGS_WINDOWS=['/SUBSYSTEM:WINDOWS'] | |
595 | v.CFLAGS_WINDOWSCE=v.CXXFLAGS_WINDOWSCE=['/SUBSYSTEM:WINDOWSCE'] | |
596 | v.CFLAGS_CRT_MULTITHREADED=v.CXXFLAGS_CRT_MULTITHREADED=['/MT'] | |
597 | v.CFLAGS_CRT_MULTITHREADED_DLL=v.CXXFLAGS_CRT_MULTITHREADED_DLL=['/MD'] | |
598 | v.CFLAGS_CRT_MULTITHREADED_DBG=v.CXXFLAGS_CRT_MULTITHREADED_DBG=['/MTd'] | |
599 | v.CFLAGS_CRT_MULTITHREADED_DLL_DBG=v.CXXFLAGS_CRT_MULTITHREADED_DLL_DBG=['/MDd'] | |
600 | v.LIB_ST='%s.lib' | |
601 | v.LIBPATH_ST='/LIBPATH:%s' | |
602 | v.STLIB_ST='%s.lib' | |
603 | v.STLIBPATH_ST='/LIBPATH:%s' | |
604 | if v.MSVC_MANIFEST: | |
655 | 605 | v.append_value('LINKFLAGS',['/MANIFEST']) |
656 | v['CFLAGS_cshlib']=[] | |
657 | v['CXXFLAGS_cxxshlib']=[] | |
658 | v['LINKFLAGS_cshlib']=v['LINKFLAGS_cxxshlib']=['/DLL'] | |
659 | v['cshlib_PATTERN']=v['cxxshlib_PATTERN']='%s.dll' | |
660 | v['implib_PATTERN']='%s.lib' | |
661 | v['IMPLIB_ST']='/IMPLIB:%s' | |
662 | v['LINKFLAGS_cstlib']=[] | |
663 | v['cstlib_PATTERN']=v['cxxstlib_PATTERN']='%s.lib' | |
664 | v['cprogram_PATTERN']=v['cxxprogram_PATTERN']='%s.exe' | |
606 | v.CFLAGS_cshlib=[] | |
607 | v.CXXFLAGS_cxxshlib=[] | |
608 | v.LINKFLAGS_cshlib=v.LINKFLAGS_cxxshlib=['/DLL'] | |
609 | v.cshlib_PATTERN=v.cxxshlib_PATTERN='%s.dll' | |
610 | v.implib_PATTERN='%s.lib' | |
611 | v.IMPLIB_ST='/IMPLIB:%s' | |
612 | v.LINKFLAGS_cstlib=[] | |
613 | v.cstlib_PATTERN=v.cxxstlib_PATTERN='%s.lib' | |
614 | v.cprogram_PATTERN=v.cxxprogram_PATTERN='%s.exe' | |
665 | 615 | @after_method('apply_link') |
666 | 616 | @feature('c','cxx') |
667 | 617 | def apply_flags_msvc(self): |
680 | 630 | pdbnode=self.link_task.outputs[0].change_ext('.pdb') |
681 | 631 | self.link_task.outputs.append(pdbnode) |
682 | 632 | if getattr(self,'install_task',None): |
683 | self.pdb_install_task=self.bld.install_files(self.install_task.dest,pdbnode,env=self.env) | |
633 | self.pdb_install_task=self.add_install_files(install_to=self.install_task.install_to,install_from=pdbnode) | |
684 | 634 | break |
685 | 635 | @feature('cprogram','cshlib','cxxprogram','cxxshlib') |
686 | 636 | @after_method('apply_link') |
689 | 639 | out_node=self.link_task.outputs[0] |
690 | 640 | man_node=out_node.parent.find_or_declare(out_node.name+'.manifest') |
691 | 641 | self.link_task.outputs.append(man_node) |
692 | self.link_task.do_manifest=True | |
693 | def exec_mf(self): | |
694 | env=self.env | |
695 | mtool=env['MT'] | |
696 | if not mtool: | |
697 | return 0 | |
698 | self.do_manifest=False | |
699 | outfile=self.outputs[0].abspath() | |
700 | manifest=None | |
701 | for out_node in self.outputs: | |
702 | if out_node.name.endswith('.manifest'): | |
703 | manifest=out_node.abspath() | |
704 | break | |
705 | if manifest is None: | |
706 | return 0 | |
707 | mode='' | |
708 | if'cprogram'in self.generator.features or'cxxprogram'in self.generator.features: | |
709 | mode='1' | |
710 | elif'cshlib'in self.generator.features or'cxxshlib'in self.generator.features: | |
711 | mode='2' | |
712 | debug('msvc: embedding manifest in mode %r'%mode) | |
713 | lst=[]+mtool | |
714 | lst.extend(Utils.to_list(env['MTFLAGS'])) | |
715 | lst.extend(['-manifest',manifest]) | |
716 | lst.append('-outputresource:%s;%s'%(outfile,mode)) | |
717 | return self.exec_command(lst) | |
718 | def quote_response_command(self,flag): | |
719 | if flag.find(' ')>-1: | |
720 | for x in('/LIBPATH:','/IMPLIB:','/OUT:','/I'): | |
721 | if flag.startswith(x): | |
722 | flag='%s"%s"'%(x,flag[len(x):]) | |
723 | break | |
724 | else: | |
725 | flag='"%s"'%flag | |
726 | return flag | |
727 | def exec_response_command(self,cmd,**kw): | |
728 | try: | |
729 | tmp=None | |
730 | if sys.platform.startswith('win')and isinstance(cmd,list)and len(' '.join(cmd))>=8192: | |
731 | program=cmd[0] | |
732 | cmd=[self.quote_response_command(x)for x in cmd] | |
733 | (fd,tmp)=tempfile.mkstemp() | |
734 | os.write(fd,'\r\n'.join(i.replace('\\','\\\\')for i in cmd[1:]).encode()) | |
735 | os.close(fd) | |
736 | cmd=[program,'@'+tmp] | |
737 | ret=self.generator.bld.exec_command(cmd,**kw) | |
738 | finally: | |
739 | if tmp: | |
740 | try: | |
741 | os.remove(tmp) | |
742 | except OSError: | |
743 | pass | |
744 | return ret | |
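The `exec_response_command` wrapper above works around the Windows command-line length limit (roughly 8 KiB) by spilling the arguments into a temporary `@file` that the MSVC tools read back. A self-contained sketch of that mechanism, with hypothetical names:

```python
import os
import tempfile

def maybe_use_response_file(cmd, limit=8192):
    """Sketch of the response-file workaround above: when the joined
    command line would exceed the limit, write everything after the
    program name to a temporary @file and pass that instead. The
    caller removes the file once the command has run."""
    if len(' '.join(cmd)) < limit:
        return cmd, None
    fd, tmp = tempfile.mkstemp(suffix='.rsp')
    # double backslashes so paths survive the response-file parser
    os.write(fd, '\r\n'.join(a.replace('\\', '\\\\') for a in cmd[1:]).encode())
    os.close(fd)
    return [cmd[0], '@' + tmp], tmp

long_cmd = ['link.exe'] + ['obj%04d.obj' % i for i in range(1500)]
short_cmd = ['cl.exe', 'main.c']
```

Short command lines are passed through untouched, which keeps error messages and process listings readable in the common case.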
745 | def exec_command_msvc(self,*k,**kw): | |
746 | if isinstance(k[0],list): | |
747 | lst=[] | |
748 | carry='' | |
749 | for a in k[0]: | |
750 | if a=='/Fo'or a=='/doc'or a[-1]==':': | |
751 | carry=a | |
752 | else: | |
753 | lst.append(carry+a) | |
754 | carry='' | |
755 | k=[lst] | |
756 | if self.env['PATH']: | |
757 | env=dict(self.env.env or os.environ) | |
758 | env.update(PATH=';'.join(self.env['PATH'])) | |
759 | kw['env']=env | |
760 | bld=self.generator.bld | |
761 | try: | |
762 | if not kw.get('cwd',None): | |
763 | kw['cwd']=bld.cwd | |
764 | except AttributeError: | |
765 | bld.cwd=kw['cwd']=bld.variant_dir | |
766 | ret=self.exec_response_command(k[0],**kw) | |
767 | if not ret and getattr(self,'do_manifest',None): | |
768 | ret=self.exec_mf() | |
769 | return ret | |
770 | def wrap_class(class_name): | |
771 | cls=Task.classes.get(class_name,None) | |
772 | if not cls: | |
773 | return None | |
774 | derived_class=type(class_name,(cls,),{}) | |
775 | def exec_command(self,*k,**kw): | |
776 | if self.env['CC_NAME']=='msvc': | |
777 | return self.exec_command_msvc(*k,**kw) | |
778 | else: | |
779 | return super(derived_class,self).exec_command(*k,**kw) | |
780 | derived_class.exec_command=exec_command | |
781 | derived_class.exec_response_command=exec_response_command | |
782 | derived_class.quote_response_command=quote_response_command | |
783 | derived_class.exec_command_msvc=exec_command_msvc | |
784 | derived_class.exec_mf=exec_mf | |
785 | if hasattr(cls,'hcode'): | |
786 | derived_class.hcode=cls.hcode | |
787 | return derived_class | |
788 | for k in'c cxx cprogram cxxprogram cshlib cxxshlib cstlib cxxstlib'.split(): | |
789 | wrap_class(k) | |
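The `wrap_class` loop removed above derived a subclass of each task class at runtime so that `exec_command` could dispatch to the MSVC-specific path while leaving other compilers on the inherited behaviour. The dynamic-subclassing shape, reduced to a standalone sketch (class and attribute names are illustrative):

```python
class GenericTask(object):
    """Stand-in for a waf Task class (hypothetical)."""
    def exec_command(self, *k, **kw):
        return 'generic'

def wrap_class_sketch(cls):
    # Derive a subclass at runtime; its exec_command takes the
    # toolchain-specific path when flagged, else defers to the parent.
    derived = type(cls.__name__, (cls,), {})
    def exec_command(self, *k, **kw):
        if getattr(self, 'use_msvc', False):
            return 'msvc'
        return super(derived, self).exec_command(*k, **kw)
    derived.exec_command = exec_command
    return derived

Wrapped = wrap_class_sketch(GenericTask)
plain, msvc = Wrapped(), Wrapped()
msvc.use_msvc = True
print(plain.exec_command(), msvc.exec_command())
```

Wrapping at class-creation time means every existing task instance picks up the dispatch without any per-call configuration.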
642 | self.env.DO_MANIFEST=True | |
790 | 643 | def make_winapp(self,family): |
791 | 644 | append=self.env.append_unique |
792 | 645 | append('DEFINES','WINAPI_FAMILY=%s'%family) |
793 | append('CXXFLAGS','/ZW') | |
794 | append('CXXFLAGS','/TP') | |
646 | append('CXXFLAGS',['/ZW','/TP']) | |
795 | 647 | for lib_path in self.env.LIBPATH: |
796 | 648 | append('CXXFLAGS','/AI%s'%lib_path) |
797 | 649 | @feature('winphoneapp') |
799 | 651 | @after_method('propagate_uselib_vars') |
800 | 652 | def make_winphone_app(self): |
801 | 653 | make_winapp(self,'WINAPI_FAMILY_PHONE_APP') |
802 | conf.env.append_unique('LINKFLAGS','/NODEFAULTLIB:ole32.lib') | |
803 | conf.env.append_unique('LINKFLAGS','PhoneAppModelHost.lib') | |
654 | conf.env.append_unique('LINKFLAGS',['/NODEFAULTLIB:ole32.lib','PhoneAppModelHost.lib']) | |
804 | 655 | @feature('winapp') |
805 | 656 | @after_method('process_use') |
806 | 657 | @after_method('propagate_uselib_vars') |
0 | #! /usr/bin/env python | |
1 | # encoding: utf-8 | |
2 | # WARNING! Do not edit! https://waf.io/book/index.html#_obtaining_the_waf_file | |
3 | ||
4 | from waflib import Task | |
5 | def build(bld): | |
6 | def run(self): | |
7 | for x in self.outputs: | |
8 | x.write('') | |
9 | for(name,cls)in Task.classes.items(): | |
10 | cls.run=run |
10 | 10 | def init_perlext(self): |
11 | 11 | self.uselib=self.to_list(getattr(self,'uselib',[])) |
12 | 12 | if not'PERLEXT'in self.uselib:self.uselib.append('PERLEXT') |
13 | self.env['cshlib_PATTERN']=self.env['cxxshlib_PATTERN']=self.env['perlext_PATTERN'] | |
13 | self.env.cshlib_PATTERN=self.env.cxxshlib_PATTERN=self.env.perlext_PATTERN | |
14 | 14 | @extension('.xs') |
15 | 15 | def xsubpp_file(self,node): |
16 | 16 | outnode=node.change_ext('.c') |
28 | 28 | else: |
29 | 29 | cver='' |
30 | 30 | self.start_msg('Checking for minimum perl version %s'%cver) |
31 | perl=Utils.to_list(getattr(Options.options,'perlbinary',None)) | |
32 | if not perl: | |
33 | perl=self.find_program('perl',var='PERL') | |
34 | if not perl: | |
35 | self.end_msg("Perl not found",color="YELLOW") | |
36 | return False | |
37 | self.env['PERL']=perl | |
38 | version=self.cmd_and_log(self.env.PERL+["-e",'printf \"%vd\", $^V']) | |
31 | perl=self.find_program('perl',var='PERL',value=getattr(Options.options,'perlbinary',None)) | |
32 | version=self.cmd_and_log(perl+["-e",'printf \"%vd\", $^V']) | |
39 | 33 | if not version: |
40 | 34 | res=False |
41 | 35 | version="Unknown" |
43 | 37 | ver=tuple(map(int,version.split("."))) |
44 | 38 | if ver<minver: |
45 | 39 | res=False |
46 | self.end_msg(version,color=res and"GREEN"or"YELLOW") | |
40 | self.end_msg(version,color=res and'GREEN'or'YELLOW') | |
47 | 41 | return res |
48 | 42 | @conf |
49 | 43 | def check_perl_module(self,module): |
74 | 68 | if xsubpp and os.path.isfile(xsubpp[0]): |
75 | 69 | return xsubpp |
76 | 70 | return self.find_program('xsubpp') |
77 | env['LINKFLAGS_PERLEXT']=cfg_lst('$Config{lddlflags}') | |
78 | env['INCLUDES_PERLEXT']=cfg_lst('$Config{archlib}/CORE') | |
79 | env['CFLAGS_PERLEXT']=cfg_lst('$Config{ccflags} $Config{cccdlflags}') | |
80 | env['EXTUTILS_TYPEMAP']=cfg_lst('$Config{privlib}/ExtUtils/typemap') | |
81 | env['XSUBPP']=find_xsubpp() | |
71 | env.LINKFLAGS_PERLEXT=cfg_lst('$Config{lddlflags}') | |
72 | env.INCLUDES_PERLEXT=cfg_lst('$Config{archlib}/CORE') | |
73 | env.CFLAGS_PERLEXT=cfg_lst('$Config{ccflags} $Config{cccdlflags}') | |
74 | env.EXTUTILS_TYPEMAP=cfg_lst('$Config{privlib}/ExtUtils/typemap') | |
75 | env.XSUBPP=find_xsubpp() | |
82 | 76 | if not getattr(Options.options,'perlarchdir',None): |
83 | env['ARCHDIR_PERL']=cfg_str('$Config{sitearch}') | |
77 | env.ARCHDIR_PERL=cfg_str('$Config{sitearch}') | |
84 | 78 | else: |
85 | env['ARCHDIR_PERL']=getattr(Options.options,'perlarchdir') | |
86 | env['perlext_PATTERN']='%s.'+cfg_str('$Config{dlext}') | |
79 | env.ARCHDIR_PERL=getattr(Options.options,'perlarchdir') | |
80 | env.perlext_PATTERN='%s.'+cfg_str('$Config{dlext}') | |
87 | 81 | def options(opt): |
88 | 82 | opt.add_option('--with-perl-binary',type='string',dest='perlbinary',help='Specify alternate perl binary',default=None) |
89 | 83 | opt.add_option('--with-perl-archdir',type='string',dest='perlarchdir',help='Specify directory where to install arch specific files',default=None) |
46 | 46 | assert(getattr(self,'install_path')),'add features="py"' |
47 | 47 | if self.install_path: |
48 | 48 | if self.install_from: |
49 | self.bld.install_files(self.install_path,[node],cwd=self.install_from,relative_trick=True) | |
50 | else: | |
51 | self.bld.install_files(self.install_path,[node],relative_trick=True) | |
49 | self.add_install_files(install_to=self.install_path,install_from=node,cwd=self.install_from,relative_trick=True) | |
50 | else: | |
51 | self.add_install_files(install_to=self.install_path,install_from=node,relative_trick=True) | |
52 | 52 | lst=[] |
53 | 53 | if self.env.PYC: |
54 | 54 | lst.append('pyc') |
71 | 71 | tsk=self.create_task(ext,node,pyobj) |
72 | 72 | tsk.pyd=pyd |
73 | 73 | if self.install_path: |
74 | self.bld.install_files(os.path.dirname(pyd),pyobj,cwd=node.parent.get_bld(),relative_trick=True) | |
74 | self.add_install_files(install_to=os.path.dirname(pyd),install_from=pyobj,cwd=node.parent.get_bld(),relative_trick=True) | |
75 | 75 | class pyc(Task.Task): |
76 | 76 | color='PINK' |
77 | def __str__(self): | |
78 | node=self.outputs[0] | |
79 | return node.path_from(node.ctx.launch_node()) | |
77 | 80 | def run(self): |
78 | 81 | cmd=[Utils.subst_vars('${PYTHON}',self.env),'-c',INST,self.inputs[0].abspath(),self.outputs[0].abspath(),self.pyd] |
79 | 82 | ret=self.generator.bld.exec_command(cmd) |
80 | 83 | return ret |
81 | 84 | class pyo(Task.Task): |
82 | 85 | color='PINK' |
86 | def __str__(self): | |
87 | node=self.outputs[0] | |
88 | return node.path_from(node.ctx.launch_node()) | |
83 | 89 | def run(self): |
84 | 90 | cmd=[Utils.subst_vars('${PYTHON}',self.env),Utils.subst_vars('${PYFLAGS_OPT}',self.env),'-c',INST,self.inputs[0].abspath(),self.outputs[0].abspath(),self.pyd] |
85 | 91 | ret=self.generator.bld.exec_command(cmd) |
161 | 167 | self.env[x]=self.environ[x] |
162 | 168 | xx=self.env.CXX_NAME and'cxx'or'c' |
163 | 169 | if'pyext'in features: |
164 | flags=self.environ.get('PYTHON_PYEXT_LDFLAGS',self.environ.get('PYTHON_LDFLAGS',None)) | |
170 | flags=self.environ.get('PYTHON_PYEXT_LDFLAGS',self.environ.get('PYTHON_LDFLAGS')) | |
165 | 171 | if flags is None: |
166 | 172 | self.fatal('No flags provided through PYTHON_PYEXT_LDFLAGS as required') |
167 | 173 | else: |
168 | 174 | self.parse_flags(flags,'PYEXT') |
169 | 175 | self.test_pyext(xx) |
170 | 176 | if'pyembed'in features: |
171 | flags=self.environ.get('PYTHON_PYEMBED_LDFLAGS',self.environ.get('PYTHON_LDFLAGS',None)) | |
177 | flags=self.environ.get('PYTHON_PYEMBED_LDFLAGS',self.environ.get('PYTHON_LDFLAGS')) | |
172 | 178 | if flags is None: |
173 | 179 | self.fatal('No flags provided through PYTHON_PYEMBED_LDFLAGS as required') |
174 | 180 | else: |
180 | 186 | features=Utils.to_list(features) |
181 | 187 | assert('pyembed'in features)or('pyext'in features),"check_python_headers features must include 'pyembed' and/or 'pyext'" |
182 | 188 | env=conf.env |
183 | if not env['CC_NAME']and not env['CXX_NAME']: | |
189 | if not env.CC_NAME and not env.CXX_NAME: | |
184 | 190 | conf.fatal('load a compiler first (gcc, g++, ..)') |
185 | 191 | if conf.python_cross_compile(features): |
186 | 192 | return |
187 | if not env['PYTHON_VERSION']: | |
193 | if not env.PYTHON_VERSION: | |
188 | 194 | conf.check_python_version() |
189 | 195 | pybin=env.PYTHON |
190 | 196 | if not pybin: |
200 | 206 | x='MACOSX_DEPLOYMENT_TARGET' |
201 | 207 | if dct[x]: |
202 | 208 | env[x]=conf.environ[x]=dct[x] |
203 | env['pyext_PATTERN']='%s'+dct['SO'] | |
204 | num='.'.join(env['PYTHON_VERSION'].split('.')[:2]) | |
209 | env.pyext_PATTERN='%s'+dct['SO'] | |
210 | num='.'.join(env.PYTHON_VERSION.split('.')[:2]) | |
205 | 211 | conf.find_program([''.join(pybin)+'-config','python%s-config'%num,'python-config-%s'%num,'python%sm-config'%num],var='PYTHON_CONFIG',msg="python-config",mandatory=False) |
206 | 212 | if env.PYTHON_CONFIG: |
207 | 213 | all_flags=[['--cflags','--libs','--ldflags']] |
238 | 244 | conf.parse_flags(all_flags,'PYEXT') |
239 | 245 | result=None |
240 | 246 | if not dct["LDVERSION"]: |
241 | dct["LDVERSION"]=env['PYTHON_VERSION'] | |
242 | for name in('python'+dct['LDVERSION'],'python'+env['PYTHON_VERSION']+'m','python'+env['PYTHON_VERSION'].replace('.','')): | |
243 | if not result and env['LIBPATH_PYEMBED']: | |
244 | path=env['LIBPATH_PYEMBED'] | |
247 | dct["LDVERSION"]=env.PYTHON_VERSION | |
248 | for name in('python'+dct['LDVERSION'],'python'+env.PYTHON_VERSION+'m','python'+env.PYTHON_VERSION.replace('.','')): | |
249 | if not result and env.LIBPATH_PYEMBED: | |
250 | path=env.LIBPATH_PYEMBED | |
245 | 251 | conf.to_log("\n\n# Trying default LIBPATH_PYEMBED: %r\n"%path) |
246 | 252 | result=conf.check(lib=name,uselib='PYEMBED',libpath=path,mandatory=False,msg='Checking for library %s in LIBPATH_PYEMBED'%name) |
247 | 253 | if not result and dct['LIBDIR']: |
259 | 265 | if result: |
260 | 266 | break |
261 | 267 | if result: |
262 | env['LIBPATH_PYEMBED']=path | |
268 | env.LIBPATH_PYEMBED=path | |
263 | 269 | env.append_value('LIB_PYEMBED',[name]) |
264 | 270 | else: |
265 | 271 | conf.to_log("\n\n### LIB NOT FOUND\n") |
266 | 272 | if Utils.is_win32 or dct['Py_ENABLE_SHARED']: |
267 | env['LIBPATH_PYEXT']=env['LIBPATH_PYEMBED'] | |
268 | env['LIB_PYEXT']=env['LIB_PYEMBED'] | |
273 | env.LIBPATH_PYEXT=env.LIBPATH_PYEMBED | |
274 | env.LIB_PYEXT=env.LIB_PYEMBED | |
269 | 275 | conf.to_log("Include path for Python extensions (found via distutils module): %r\n"%(dct['INCLUDEPY'],)) |
270 | env['INCLUDES_PYEXT']=[dct['INCLUDEPY']] | |
271 | env['INCLUDES_PYEMBED']=[dct['INCLUDEPY']] | |
272 | if env['CC_NAME']=='gcc': | |
276 | env.INCLUDES_PYEXT=[dct['INCLUDEPY']] | |
277 | env.INCLUDES_PYEMBED=[dct['INCLUDEPY']] | |
278 | if env.CC_NAME=='gcc': | |
273 | 279 | env.append_value('CFLAGS_PYEMBED',['-fno-strict-aliasing']) |
274 | 280 | env.append_value('CFLAGS_PYEXT',['-fno-strict-aliasing']) |
275 | if env['CXX_NAME']=='gcc': | |
281 | if env.CXX_NAME=='gcc': | |
276 | 282 | env.append_value('CXXFLAGS_PYEMBED',['-fno-strict-aliasing']) |
277 | 283 | env.append_value('CXXFLAGS_PYEXT',['-fno-strict-aliasing']) |
278 | 284 | if env.CC_NAME=="msvc": |
286 | 292 | @conf |
287 | 293 | def check_python_version(conf,minver=None): |
288 | 294 | assert minver is None or isinstance(minver,tuple) |
289 | pybin=conf.env['PYTHON'] | |
295 | pybin=conf.env.PYTHON | |
290 | 296 | if not pybin: |
291 | 297 | conf.fatal('could not find the python executable') |
292 | 298 | cmd=pybin+['-c','import sys\nfor x in sys.version_info: print(str(x))'] |
293 | Logs.debug('python: Running python command %r'%cmd) | |
299 | Logs.debug('python: Running python command %r',cmd) | |
294 | 300 | lines=conf.cmd_and_log(cmd).split() |
295 | assert len(lines)==5,"found %i lines, expected 5: %r"%(len(lines),lines) | |
301 | assert len(lines)==5,"found %r lines, expected 5: %r"%(len(lines),lines) | |
296 | 302 | pyver_tuple=(int(lines[0]),int(lines[1]),int(lines[2]),lines[3],int(lines[4])) |
297 | 303 | result=(minver is None)or(pyver_tuple>=minver) |
298 | 304 | if result: |
299 | 305 | pyver='.'.join([str(x)for x in pyver_tuple[:2]]) |
300 | conf.env['PYTHON_VERSION']=pyver | |
306 | conf.env.PYTHON_VERSION=pyver | |
301 | 307 | if'PYTHONDIR'in conf.env: |
302 | pydir=conf.env['PYTHONDIR'] | |
308 | pydir=conf.env.PYTHONDIR | |
303 | 309 | elif'PYTHONDIR'in conf.environ: |
304 | 310 | pydir=conf.environ['PYTHONDIR'] |
305 | 311 | else: |
309 | 315 | python_LIBDEST=None |
310 | 316 | (pydir,)=conf.get_python_variables(["get_python_lib(standard_lib=0, prefix=%r) or ''"%conf.env.PREFIX]) |
311 | 317 | if python_LIBDEST is None: |
312 | if conf.env['LIBDIR']: | |
313 | python_LIBDEST=os.path.join(conf.env['LIBDIR'],"python"+pyver) | |
318 | if conf.env.LIBDIR: | |
319 | python_LIBDEST=os.path.join(conf.env.LIBDIR,'python'+pyver) | |
314 | 320 | else: |
315 | python_LIBDEST=os.path.join(conf.env['PREFIX'],"lib","python"+pyver) | |
321 | python_LIBDEST=os.path.join(conf.env.PREFIX,'lib','python'+pyver) | |
316 | 322 | if'PYTHONARCHDIR'in conf.env: |
317 | pyarchdir=conf.env['PYTHONARCHDIR'] | |
323 | pyarchdir=conf.env.PYTHONARCHDIR | |
318 | 324 | elif'PYTHONARCHDIR'in conf.environ: |
319 | 325 | pyarchdir=conf.environ['PYTHONARCHDIR'] |
320 | 326 | else: |
324 | 330 | if hasattr(conf,'define'): |
325 | 331 | conf.define('PYTHONDIR',pydir) |
326 | 332 | conf.define('PYTHONARCHDIR',pyarchdir) |
327 | conf.env['PYTHONDIR']=pydir | |
328 | conf.env['PYTHONARCHDIR']=pyarchdir | |
333 | conf.env.PYTHONDIR=pydir | |
334 | conf.env.PYTHONARCHDIR=pyarchdir | |
329 | 335 | pyver_full='.'.join(map(str,pyver_tuple[:3])) |
330 | 336 | if minver is None: |
331 | 337 | conf.msg('Checking for python version',pyver_full) |
344 | 350 | ''' |
345 | 351 | @conf |
346 | 352 | def check_python_module(conf,module_name,condition=''): |
347 | msg="Checking for python module '%s'"%module_name | |
353 | msg="Checking for python module %r"%module_name | |
348 | 354 | if condition: |
349 | 355 | msg='%s (%s)'%(msg,condition) |
350 | 356 | conf.start_msg(msg) |
351 | 357 | try: |
352 | ret=conf.cmd_and_log(conf.env['PYTHON']+['-c',PYTHON_MODULE_TEMPLATE%module_name]) | |
358 | ret=conf.cmd_and_log(conf.env.PYTHON+['-c',PYTHON_MODULE_TEMPLATE%module_name]) | |
353 | 359 | except Exception: |
354 | 360 | conf.end_msg(False) |
355 | 361 | conf.fatal('Could not find the python module %r'%module_name) |
375 | 381 | conf.end_msg(ret) |
376 | 382 | def configure(conf): |
377 | 383 | v=conf.env |
378 | v['PYTHON']=Options.options.python or os.environ.get('PYTHON',sys.executable) | |
379 | 384 | if Options.options.pythondir: |
380 | v['PYTHONDIR']=Options.options.pythondir | |
385 | v.PYTHONDIR=Options.options.pythondir | |
381 | 386 | if Options.options.pythonarchdir: |
382 | v['PYTHONARCHDIR']=Options.options.pythonarchdir | |
383 | conf.find_program('python',var='PYTHON') | |
384 | v['PYFLAGS']='' | |
385 | v['PYFLAGS_OPT']='-O' | |
386 | v['PYC']=getattr(Options.options,'pyc',1) | |
387 | v['PYO']=getattr(Options.options,'pyo',1) | |
387 | v.PYTHONARCHDIR=Options.options.pythonarchdir | |
388 | conf.find_program('python',var='PYTHON',value=Options.options.python or sys.executable) | |
389 | v.PYFLAGS='' | |
390 | v.PYFLAGS_OPT='-O' | |
391 | v.PYC=getattr(Options.options,'pyc',1) | |
392 | v.PYO=getattr(Options.options,'pyo',1) | |
388 | 393 | try: |
389 | 394 | v.PYTAG=conf.cmd_and_log(conf.env.PYTHON+['-c',"import imp;print(imp.get_tag())"]).strip() |
390 | 395 | except Errors.WafError: |
0 | #! /usr/bin/env python | |
1 | # encoding: utf-8 | |
2 | # WARNING! Do not edit! https://waf.io/book/index.html#_obtaining_the_waf_file | |
3 | ||
4 | try: | |
5 | from xml.sax import make_parser | |
6 | from xml.sax.handler import ContentHandler | |
7 | except ImportError: | |
8 | has_xml=False | |
9 | ContentHandler=object | |
10 | else: | |
11 | has_xml=True | |
12 | import os,sys | |
13 | from waflib.Tools import cxx | |
14 | from waflib import Task,Utils,Options,Errors,Context | |
15 | from waflib.TaskGen import feature,after_method,extension | |
16 | from waflib.Configure import conf | |
17 | from waflib import Logs | |
18 | MOC_H=['.h','.hpp','.hxx','.hh'] | |
19 | EXT_RCC=['.qrc'] | |
20 | EXT_UI=['.ui'] | |
21 | EXT_QT4=['.cpp','.cc','.cxx','.C'] | |
22 | QT4_LIBS="QtCore QtGui QtUiTools QtNetwork QtOpenGL QtSql QtSvg QtTest QtXml QtXmlPatterns QtWebKit Qt3Support QtHelp QtScript QtDeclarative QtDesigner" | |
23 | class qxx(Task.classes['cxx']): | |
24 | def __init__(self,*k,**kw): | |
25 | Task.Task.__init__(self,*k,**kw) | |
26 | self.moc_done=0 | |
27 | def runnable_status(self): | |
28 | if self.moc_done: | |
29 | return Task.Task.runnable_status(self) | |
30 | else: | |
31 | for t in self.run_after: | |
32 | if not t.hasrun: | |
33 | return Task.ASK_LATER | |
34 | self.add_moc_tasks() | |
35 | return Task.Task.runnable_status(self) | |
36 | def create_moc_task(self,h_node,m_node): | |
37 | try: | |
38 | moc_cache=self.generator.bld.moc_cache | |
39 | except AttributeError: | |
40 | moc_cache=self.generator.bld.moc_cache={} | |
41 | try: | |
42 | return moc_cache[h_node] | |
43 | except KeyError: | |
44 | tsk=moc_cache[h_node]=Task.classes['moc'](env=self.env,generator=self.generator) | |
45 | tsk.set_inputs(h_node) | |
46 | tsk.set_outputs(m_node) | |
47 | if self.generator: | |
48 | self.generator.tasks.append(tsk) | |
49 | gen=self.generator.bld.producer | |
50 | gen.outstanding.insert(0,tsk) | |
51 | gen.total+=1 | |
52 | return tsk | |
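`create_moc_task` above memoizes one moc task per header node in `bld.moc_cache`, so several translation units including the same header share a single task instead of racing to regenerate it. The caching shape, reduced to its essentials (names are illustrative):

```python
def get_or_create_task(cache, h_node, factory):
    """One task per header: reuse the cached entry when present,
    otherwise build it once via factory and remember it."""
    try:
        return cache[h_node]
    except KeyError:
        tsk = cache[h_node] = factory(h_node)
        return tsk

made = []
cache = {}
t1 = get_or_create_task(cache, 'widget.h', lambda n: made.append(n) or ('moc', n))
t2 = get_or_create_task(cache, 'widget.h', lambda n: made.append(n) or ('moc', n))
```

The EAFP `try/except KeyError` form mirrors the original code; `dict.setdefault` would also work but would call the factory even on a cache hit.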
53 | def moc_h_ext(self): | |
54 | ext=[] | |
55 | try: | |
56 | ext=Options.options.qt_header_ext.split() | |
57 | except AttributeError: | |
58 | pass | |
59 | if not ext: | |
60 | ext=MOC_H | |
61 | return ext | |
62 | def add_moc_tasks(self): | |
63 | node=self.inputs[0] | |
64 | bld=self.generator.bld | |
65 | try: | |
66 | self.signature() | |
67 | except KeyError: | |
68 | pass | |
69 | else: | |
70 | delattr(self,'cache_sig') | |
71 | include_nodes=[node.parent]+self.generator.includes_nodes | |
72 | moctasks=[] | |
73 | mocfiles=set([]) | |
74 | for d in bld.raw_deps.get(self.uid(),[]): | |
75 | if not d.endswith('.moc'): | |
76 | continue | |
77 | if d in mocfiles: | |
78 | continue | |
79 | mocfiles.add(d) | |
80 | h_node=None | |
81 | base2=d[:-4] | |
82 | for x in include_nodes: | |
83 | for e in self.moc_h_ext(): | |
84 | h_node=x.find_node(base2+e) | |
85 | if h_node: | |
86 | break | |
87 | if h_node: | |
88 | m_node=h_node.change_ext('.moc') | |
89 | break | |
90 | else: | |
91 | for k in EXT_QT4: | |
92 | if base2.endswith(k): | |
93 | for x in include_nodes: | |
94 | h_node=x.find_node(base2) | |
95 | if h_node: | |
96 | break | |
97 | if h_node: | |
98 | m_node=h_node.change_ext(k+'.moc') | |
99 | break | |
100 | if not h_node: | |
101 | raise Errors.WafError('No source found for %r which is a moc file'%d) | |
102 | task=self.create_moc_task(h_node,m_node) | |
103 | moctasks.append(task) | |
104 | self.run_after.update(set(moctasks)) | |
105 | self.moc_done=1 | |
106 | class trans_update(Task.Task): | |
107 | run_str='${QT_LUPDATE} ${SRC} -ts ${TGT}' | |
108 | color='BLUE' | |
109 | Task.update_outputs(trans_update) | |
110 | class XMLHandler(ContentHandler): | |
111 | def __init__(self): | |
112 | self.buf=[] | |
113 | self.files=[] | |
114 | def startElement(self,name,attrs): | |
115 | if name=='file': | |
116 | self.buf=[] | |
117 | def endElement(self,name): | |
118 | if name=='file': | |
119 | self.files.append(str(''.join(self.buf))) | |
120 | def characters(self,cars): | |
121 | self.buf.append(cars) | |
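The `XMLHandler` above is a SAX content handler that collects the text of every `<file>` element in a `.qrc` resource description, which `rcc.scan` then turns into build dependencies. The same handler pattern, exercised standalone on an in-memory document:

```python
import io
from xml.sax import make_parser
from xml.sax.handler import ContentHandler

class FileCollector(ContentHandler):
    """Collect the character data of each <file> element, buffering
    between startElement and endElement as the handler above does."""
    def __init__(self):
        self.buf, self.files = [], []
    def startElement(self, name, attrs):
        if name == 'file':
            self.buf = []
    def endElement(self, name):
        if name == 'file':
            self.files.append(''.join(self.buf))
    def characters(self, data):
        self.buf.append(data)

parser = make_parser()
h = FileCollector()
parser.setContentHandler(h)
parser.parse(io.BytesIO(
    b'<RCC><qresource><file>icon.png</file>'
    b'<file>logo.svg</file></qresource></RCC>'))
print(h.files)
```

Buffering in `characters` matters because SAX may deliver one element's text in several chunks; joining at `endElement` reassembles it.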
122 | @extension(*EXT_RCC) | |
123 | def create_rcc_task(self,node): | |
124 | rcnode=node.change_ext('_rc.cpp') | |
125 | self.create_task('rcc',node,rcnode) | |
126 | cpptask=self.create_task('cxx',rcnode,rcnode.change_ext('.o')) | |
127 | try: | |
128 | self.compiled_tasks.append(cpptask) | |
129 | except AttributeError: | |
130 | self.compiled_tasks=[cpptask] | |
131 | return cpptask | |
132 | @extension(*EXT_UI) | |
133 | def create_uic_task(self,node): | |
134 | uictask=self.create_task('ui4',node) | |
135 | uictask.outputs=[self.path.find_or_declare(self.env['ui_PATTERN']%node.name[:-3])] | |
136 | @extension('.ts') | |
137 | def add_lang(self,node): | |
138 | self.lang=self.to_list(getattr(self,'lang',[]))+[node] | |
139 | @feature('qt4') | |
140 | @after_method('apply_link') | |
141 | def apply_qt4(self): | |
142 | if getattr(self,'lang',None): | |
143 | qmtasks=[] | |
144 | for x in self.to_list(self.lang): | |
145 | if isinstance(x,str): | |
146 | x=self.path.find_resource(x+'.ts') | |
147 | qmtasks.append(self.create_task('ts2qm',x,x.change_ext('.qm'))) | |
148 | if getattr(self,'update',None)and Options.options.trans_qt4: | |
149 | cxxnodes=[a.inputs[0]for a in self.compiled_tasks]+[a.inputs[0]for a in self.tasks if getattr(a,'inputs',None)and a.inputs[0].name.endswith('.ui')] | |
150 | for x in qmtasks: | |
151 | self.create_task('trans_update',cxxnodes,x.inputs) | |
152 | if getattr(self,'langname',None): | |
153 | qmnodes=[x.outputs[0]for x in qmtasks] | |
154 | rcnode=self.langname | |
155 | if isinstance(rcnode,str): | |
156 | rcnode=self.path.find_or_declare(rcnode+'.qrc') | |
157 | t=self.create_task('qm2rcc',qmnodes,rcnode) | |
158 | k=create_rcc_task(self,t.outputs[0]) | |
159 | self.link_task.inputs.append(k.outputs[0]) | |
160 | lst=[] | |
161 | for flag in self.to_list(self.env['CXXFLAGS']): | |
162 | if len(flag)<2:continue | |
163 | f=flag[0:2] | |
164 | if f in('-D','-I','/D','/I'): | |
165 | if(f[0]=='/'): | |
166 | lst.append('-'+flag[1:]) | |
167 | else: | |
168 | lst.append(flag) | |
169 | self.env.append_value('MOC_FLAGS',lst) | |
170 | @extension(*EXT_QT4) | |
171 | def cxx_hook(self,node): | |
172 | return self.create_compiled_task('qxx',node) | |
173 | class rcc(Task.Task): | |
174 | color='BLUE' | |
175 | run_str='${QT_RCC} -name ${tsk.rcname()} ${SRC[0].abspath()} ${RCC_ST} -o ${TGT}' | |
176 | ext_out=['.h'] | |
177 | def rcname(self): | |
178 | return os.path.splitext(self.inputs[0].name)[0] | |
179 | def scan(self): | |
180 | if not has_xml: | |
181 | Logs.error('no xml support was found, the rcc dependencies will be incomplete!') | |
182 | return([],[]) | |
183 | parser=make_parser() | |
184 | curHandler=XMLHandler() | |
185 | parser.setContentHandler(curHandler) | |
186 | fi=open(self.inputs[0].abspath(),'r') | |
187 | try: | |
188 | parser.parse(fi) | |
189 | finally: | |
190 | fi.close() | |
191 | nodes=[] | |
192 | names=[] | |
193 | root=self.inputs[0].parent | |
194 | for x in curHandler.files: | |
195 | nd=root.find_resource(x) | |
196 | if nd:nodes.append(nd) | |
197 | else:names.append(x) | |
198 | return(nodes,names) | |
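The `rcc.scan` method above parses the `.qrc` input with `xml.sax` so that every `<file>` entry becomes a tracked dependency. A simplified, self-contained sketch of that scan (our `QrcHandler` condenses the module's `XMLHandler`; real waf also resolves each path against the build tree):

```python
# Sketch of the .qrc dependency scan in rcc.scan above: a SAX handler
# collects the text of every <file> element inside <qresource>.
import xml.sax

class QrcHandler(xml.sax.ContentHandler):
    def __init__(self):
        xml.sax.ContentHandler.__init__(self)
        self.files = []
        self.buf = []
    def startElement(self, name, attrs):
        if name == 'file':
            self.buf = []
    def endElement(self, name):
        if name == 'file':
            self.files.append(''.join(self.buf))
    def characters(self, content):
        self.buf.append(content)

def qrc_files(text):
    handler = QrcHandler()
    xml.sax.parseString(text.encode('utf-8'), handler)
    return handler.files

qrc = '<RCC><qresource><file>icons/app.png</file><file>ui/main.qml</file></qresource></RCC>'
print(qrc_files(qrc))  # -> ['icons/app.png', 'ui/main.qml']
```

This is why the scan degrades gracefully when `has_xml` is false: without a SAX parser the file list cannot be recovered, so waf only warns that dependencies will be incomplete.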
199 | class moc(Task.Task): | |
200 | color='BLUE' | |
201 | run_str='${QT_MOC} ${MOC_FLAGS} ${MOCCPPPATH_ST:INCPATHS} ${MOCDEFINES_ST:DEFINES} ${SRC} ${MOC_ST} ${TGT}' | |
202 | def keyword(self): | |
203 | return"Creating" | |
204 | def __str__(self): | |
205 | return self.outputs[0].path_from(self.generator.bld.launch_node()) | |
206 | class ui4(Task.Task): | |
207 | color='BLUE' | |
208 | run_str='${QT_UIC} ${SRC} -o ${TGT}' | |
209 | ext_out=['.h'] | |
210 | class ts2qm(Task.Task): | |
211 | color='BLUE' | |
212 | run_str='${QT_LRELEASE} ${QT_LRELEASE_FLAGS} ${SRC} -qm ${TGT}' | |
213 | class qm2rcc(Task.Task): | |
214 | color='BLUE' | |
215 | after='ts2qm' | |
216 | def run(self): | |
217 | txt='\n'.join(['<file>%s</file>'%k.path_from(self.outputs[0].parent)for k in self.inputs]) | |
218 | code='<!DOCTYPE RCC><RCC version="1.0">\n<qresource>\n%s\n</qresource>\n</RCC>'%txt | |
219 | self.outputs[0].write(code) | |
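The `qm2rcc.run` method above wraps the compiled `.qm` translation files in a generated `.qrc` so they can then be fed to rcc and embedded in the binary. A standalone sketch of that string assembly (the helper name `qm_to_qrc` is ours; real waf uses node paths relative to the output's directory):

```python
# Sketch of qm2rcc.run above: emit a minimal Qt resource file listing
# each compiled translation, ready to be handed to rcc.
def qm_to_qrc(qm_paths):
    txt = '\n'.join('<file>%s</file>' % p for p in qm_paths)
    return '<!DOCTYPE RCC><RCC version="1.0">\n<qresource>\n%s\n</qresource>\n</RCC>' % txt

print(qm_to_qrc(['app_fr.qm', 'app_de.qm']))
```

The `after='ts2qm'` attribute on the task class guarantees the `.qm` files exist before this XML is written.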
220 | def configure(self): | |
221 | self.find_qt4_binaries() | |
222 | self.set_qt4_libs_to_check() | |
223 | self.set_qt4_defines() | |
224 | self.find_qt4_libraries() | |
225 | self.add_qt4_rpath() | |
226 | self.simplify_qt4_libs() | |
227 | @conf | |
228 | def find_qt4_binaries(self): | |
229 | env=self.env | |
230 | opt=Options.options | |
231 | qtdir=getattr(opt,'qtdir','') | |
232 | qtbin=getattr(opt,'qtbin','') | |
233 | paths=[] | |
234 | if qtdir: | |
235 | qtbin=os.path.join(qtdir,'bin') | |
236 | if not qtdir: | |
237 | qtdir=os.environ.get('QT4_ROOT','') | |
238 | qtbin=os.environ.get('QT4_BIN',None)or os.path.join(qtdir,'bin') | |
239 | if qtbin: | |
240 | paths=[qtbin] | |
241 | if not qtdir: | |
242 | paths=os.environ.get('PATH','').split(os.pathsep) | |
243 | paths.append('/usr/share/qt4/bin/') | |
244 | try: | |
245 | lst=Utils.listdir('/usr/local/Trolltech/') | |
246 | except OSError: | |
247 | pass | |
248 | else: | |
249 | if lst: | |
250 | lst.sort() | |
251 | lst.reverse() | |
252 | qtdir='/usr/local/Trolltech/%s/'%lst[0] | |
253 | qtbin=os.path.join(qtdir,'bin') | |
254 | paths.append(qtbin) | |
255 | cand=None | |
256 | prev_ver=['4','0','0'] | |
257 | for qmk in('qmake-qt4','qmake4','qmake'): | |
258 | try: | |
259 | qmake=self.find_program(qmk,path_list=paths) | |
260 | except self.errors.ConfigurationError: | |
261 | pass | |
262 | else: | |
263 | try: | |
264 | version=self.cmd_and_log(qmake+['-query','QT_VERSION']).strip() | |
265 | except self.errors.WafError: | |
266 | pass | |
267 | else: | |
268 | if version: | |
269 | new_ver=version.split('.') | |
270 | if new_ver>prev_ver: | |
271 | cand=qmake | |
272 | prev_ver=new_ver | |
273 | if cand: | |
274 | self.env.QMAKE=cand | |
275 | else: | |
276 | self.fatal('Could not find qmake for qt4') | |
277 | qtbin=self.cmd_and_log(self.env.QMAKE+['-query','QT_INSTALL_BINS']).strip()+os.sep | |
278 | def find_bin(lst,var): | |
279 | if var in env: | |
280 | return | |
281 | for f in lst: | |
282 | try: | |
283 | ret=self.find_program(f,path_list=paths) | |
284 | except self.errors.ConfigurationError: | |
285 | pass | |
286 | else: | |
287 | env[var]=ret | |
288 | break | |
289 | find_bin(['uic-qt3','uic3'],'QT_UIC3') | |
290 | find_bin(['uic-qt4','uic'],'QT_UIC') | |
291 | if not env.QT_UIC: | |
292 | self.fatal('cannot find the uic compiler for qt4') | |
293 | self.start_msg('Checking for uic version') | |
294 | uicver=self.cmd_and_log(env.QT_UIC+["-version"],output=Context.BOTH) | |
295 | uicver=''.join(uicver).strip() | |
296 | uicver=uicver.replace('Qt User Interface Compiler ','').replace('User Interface Compiler for Qt','') | |
297 | self.end_msg(uicver) | |
298 | if uicver.find(' 3.')!=-1: | |
299 | self.fatal('this uic compiler is for qt3, add uic for qt4 to your path') | |
300 | find_bin(['moc-qt4','moc'],'QT_MOC') | |
301 | find_bin(['rcc-qt4','rcc'],'QT_RCC') | |
302 | find_bin(['lrelease-qt4','lrelease'],'QT_LRELEASE') | |
303 | find_bin(['lupdate-qt4','lupdate'],'QT_LUPDATE') | |
304 | env['UIC3_ST']='%s -o %s' | |
305 | env['UIC_ST']='%s -o %s' | |
306 | env['MOC_ST']='-o' | |
307 | env['ui_PATTERN']='ui_%s.h' | |
308 | env['QT_LRELEASE_FLAGS']=['-silent'] | |
309 | env.MOCCPPPATH_ST='-I%s' | |
310 | env.MOCDEFINES_ST='-D%s' | |
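Inside `find_qt4_binaries` above, each qmake candidate (`qmake-qt4`, `qmake4`, `qmake`) is queried for `QT_VERSION` and the highest-versioned one wins. A standalone sketch of that selection (the function name `pick_qmake` is ours). Note that waf compares lists of version *strings*, which works for the single-digit components typical of Qt 4 but would rank `'4.9'` above `'4.10'` lexicographically:

```python
# Sketch of the qmake candidate selection in find_qt4_binaries above:
# keep whichever candidate reports the highest QT_VERSION, starting
# from a floor of 4.0.0 so Qt 3 qmakes are rejected.
def pick_qmake(candidates, floor=('4', '0', '0')):
    cand = None
    prev_ver = list(floor)
    for name, version in candidates:
        new_ver = version.split('.')
        if new_ver > prev_ver:       # string-wise comparison, as in waf
            cand = name
            prev_ver = new_ver
    return cand

print(pick_qmake([('qmake-qt4', '4.8.7'), ('qmake', '4.6.2')]))  # -> qmake-qt4
print(pick_qmake([('qmake', '3.3.8')]))                          # -> None
```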
311 | @conf | |
312 | def find_qt4_libraries(self): | |
313 | qtlibs=getattr(Options.options,'qtlibs',None)or os.environ.get("QT4_LIBDIR",None) | |
314 | if not qtlibs: | |
315 | try: | |
316 | qtlibs=self.cmd_and_log(self.env.QMAKE+['-query','QT_INSTALL_LIBS']).strip() | |
317 | except Errors.WafError: | |
318 | qtdir=self.cmd_and_log(self.env.QMAKE+['-query','QT_INSTALL_PREFIX']).strip()+os.sep | |
319 | qtlibs=os.path.join(qtdir,'lib') | |
320 | self.msg('Found the Qt4 libraries in',qtlibs) | |
321 | qtincludes=os.environ.get("QT4_INCLUDES",None)or self.cmd_and_log(self.env.QMAKE+['-query','QT_INSTALL_HEADERS']).strip() | |
322 | env=self.env | |
323 | if not'PKG_CONFIG_PATH'in os.environ: | |
324 | os.environ['PKG_CONFIG_PATH']='%s:%s/pkgconfig:/usr/lib/qt4/lib/pkgconfig:/opt/qt4/lib/pkgconfig:/usr/lib/qt4/lib:/opt/qt4/lib'%(qtlibs,qtlibs) | |
325 | try: | |
326 | if os.environ.get("QT4_XCOMPILE",None): | |
327 | raise self.errors.ConfigurationError() | |
328 | self.check_cfg(atleast_pkgconfig_version='0.1') | |
329 | except self.errors.ConfigurationError: | |
330 | for i in self.qt4_vars: | |
331 | uselib=i.upper() | |
332 | if Utils.unversioned_sys_platform()=="darwin": | |
333 | frameworkName=i+".framework" | |
334 | qtDynamicLib=os.path.join(qtlibs,frameworkName,i) | |
335 | if os.path.exists(qtDynamicLib): | |
336 | env.append_unique('FRAMEWORK_'+uselib,i) | |
337 | self.msg('Checking for %s'%i,qtDynamicLib,'GREEN') | |
338 | else: | |
339 | self.msg('Checking for %s'%i,False,'YELLOW') | |
340 | env.append_unique('INCLUDES_'+uselib,os.path.join(qtlibs,frameworkName,'Headers')) | |
341 | elif env.DEST_OS!="win32": | |
342 | qtDynamicLib=os.path.join(qtlibs,"lib"+i+".so") | |
343 | qtStaticLib=os.path.join(qtlibs,"lib"+i+".a") | |
344 | if os.path.exists(qtDynamicLib): | |
345 | env.append_unique('LIB_'+uselib,i) | |
346 | self.msg('Checking for %s'%i,qtDynamicLib,'GREEN') | |
347 | elif os.path.exists(qtStaticLib): | |
348 | env.append_unique('LIB_'+uselib,i) | |
349 | self.msg('Checking for %s'%i,qtStaticLib,'GREEN') | |
350 | else: | |
351 | self.msg('Checking for %s'%i,False,'YELLOW') | |
352 | env.append_unique('LIBPATH_'+uselib,qtlibs) | |
353 | env.append_unique('INCLUDES_'+uselib,qtincludes) | |
354 | env.append_unique('INCLUDES_'+uselib,os.path.join(qtincludes,i)) | |
355 | else: | |
356 | for k in("lib%s.a","lib%s4.a","%s.lib","%s4.lib"): | |
357 | lib=os.path.join(qtlibs,k%i) | |
358 | if os.path.exists(lib): | |
359 | env.append_unique('LIB_'+uselib,i+k[k.find("%s")+2:k.find('.')]) | |
360 | self.msg('Checking for %s'%i,lib,'GREEN') | |
361 | break | |
362 | else: | |
363 | self.msg('Checking for %s'%i,False,'YELLOW') | |
364 | env.append_unique('LIBPATH_'+uselib,qtlibs) | |
365 | env.append_unique('INCLUDES_'+uselib,qtincludes) | |
366 | env.append_unique('INCLUDES_'+uselib,os.path.join(qtincludes,i)) | |
367 | uselib=i.upper()+"_debug" | |
368 | for k in("lib%sd.a","lib%sd4.a","%sd.lib","%sd4.lib"): | |
369 | lib=os.path.join(qtlibs,k%i) | |
370 | if os.path.exists(lib): | |
371 | env.append_unique('LIB_'+uselib,i+k[k.find("%s")+2:k.find('.')]) | |
372 | self.msg('Checking for %s'%i,lib,'GREEN') | |
373 | break | |
374 | else: | |
375 | self.msg('Checking for %s'%i,False,'YELLOW') | |
376 | env.append_unique('LIBPATH_'+uselib,qtlibs) | |
377 | env.append_unique('INCLUDES_'+uselib,qtincludes) | |
378 | env.append_unique('INCLUDES_'+uselib,os.path.join(qtincludes,i)) | |
379 | else: | |
380 | for i in self.qt4_vars_debug+self.qt4_vars: | |
381 | self.check_cfg(package=i,args='--cflags --libs',mandatory=False) | |
382 | @conf | |
383 | def simplify_qt4_libs(self): | |
384 | env=self.env | |
385 | def process_lib(vars_,coreval): | |
386 | for d in vars_: | |
387 | var=d.upper() | |
388 | if var=='QTCORE': | |
389 | continue | |
390 | value=env['LIBPATH_'+var] | |
391 | if value: | |
392 | core=env[coreval] | |
393 | accu=[] | |
394 | for lib in value: | |
395 | if lib in core: | |
396 | continue | |
397 | accu.append(lib) | |
398 | env['LIBPATH_'+var]=accu | |
399 | process_lib(self.qt4_vars,'LIBPATH_QTCORE') | |
400 | process_lib(self.qt4_vars_debug,'LIBPATH_QTCORE_DEBUG') | |
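`simplify_qt4_libs` above (and its qt5 counterpart later in the diff) prunes each Qt module's library search path of any directory already contributed by QtCore, keeping the final link command line short. The core of `process_lib` reduces to a single filter, sketched here standalone (the helper name `simplify` is ours):

```python
# Sketch of the LIBPATH deduplication in simplify_qt4_libs above:
# drop any path a module shares with QtCore, since QtCore's paths
# are always on the link line anyway.
def simplify(libpath, core_paths):
    return [p for p in libpath if p not in core_paths]

print(simplify(['/opt/qt4/lib', '/usr/lib'], ['/opt/qt4/lib']))  # -> ['/usr/lib']
```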
401 | @conf | |
402 | def add_qt4_rpath(self): | |
403 | env=self.env | |
404 | if getattr(Options.options,'want_rpath',False): | |
405 | def process_rpath(vars_,coreval): | |
406 | for d in vars_: | |
407 | var=d.upper() | |
408 | value=env['LIBPATH_'+var] | |
409 | if value: | |
410 | core=env[coreval] | |
411 | accu=[] | |
412 | for lib in value: | |
413 | if var!='QTCORE': | |
414 | if lib in core: | |
415 | continue | |
416 | accu.append('-Wl,--rpath='+lib) | |
417 | env['RPATH_'+var]=accu | |
418 | process_rpath(self.qt4_vars,'LIBPATH_QTCORE') | |
419 | process_rpath(self.qt4_vars_debug,'LIBPATH_QTCORE_DEBUG') | |
420 | @conf | |
421 | def set_qt4_libs_to_check(self): | |
422 | if not hasattr(self,'qt4_vars'): | |
423 | self.qt4_vars=QT4_LIBS | |
424 | self.qt4_vars=Utils.to_list(self.qt4_vars) | |
425 | if not hasattr(self,'qt4_vars_debug'): | |
426 | self.qt4_vars_debug=[a+'_debug'for a in self.qt4_vars] | |
427 | self.qt4_vars_debug=Utils.to_list(self.qt4_vars_debug) | |
428 | @conf | |
429 | def set_qt4_defines(self): | |
430 | if sys.platform!='win32': | |
431 | return | |
432 | for x in self.qt4_vars: | |
433 | y=x[2:].upper() | |
434 | self.env.append_unique('DEFINES_%s'%x.upper(),'QT_%s_LIB'%y) | |
435 | self.env.append_unique('DEFINES_%s_DEBUG'%x.upper(),'QT_%s_LIB'%y) | |
436 | def options(opt): | |
437 | opt.add_option('--want-rpath',action='store_true',default=False,dest='want_rpath',help='enable the rpath for qt libraries') | |
438 | opt.add_option('--header-ext',type='string',default='',help='header extension for moc files',dest='qt_header_ext') | |
439 | for i in'qtdir qtbin qtlibs'.split(): | |
440 | opt.add_option('--'+i,type='string',default='',dest=i) | |
441 | opt.add_option('--translate',action="store_true",help="collect translation strings",dest="trans_qt4",default=False) |
12 | 12 | import os,sys |
13 | 13 | from waflib.Tools import cxx |
14 | 14 | from waflib import Task,Utils,Options,Errors,Context |
15 | from waflib.TaskGen import feature,after_method,extension | |
15 | from waflib.TaskGen import feature,after_method,extension,before_method | |
16 | 16 | from waflib.Configure import conf |
17 | 17 | from waflib import Logs |
18 | 18 | MOC_H=['.h','.hpp','.hxx','.hh'] |
21 | 21 | EXT_QT5=['.cpp','.cc','.cxx','.C'] |
22 | 22 | QT5_LIBS=''' |
23 | 23 | qtmain |
24 | Qt53DCore | |
25 | Qt53DExtras | |
26 | Qt53DInput | |
27 | Qt53DLogic | |
28 | Qt53DQuickExtras | |
29 | Qt53DQuickInput | |
30 | Qt53DQuickRender | |
31 | Qt53DQuick.so | |
32 | Qt53DRender | |
24 | 33 | Qt5Bluetooth |
34 | Qt5Charts | |
25 | 35 | Qt5CLucene |
26 | 36 | Qt5Concurrent |
27 | 37 | Qt5Core |
38 | Qt5DataVisualization | |
28 | 39 | Qt5DBus |
29 | 40 | Qt5Declarative |
30 | 41 | Qt5DesignerComponents |
31 | 42 | Qt5Designer |
43 | Qt5EglDeviceIntegration | |
44 | Qt5Gamepad | |
32 | 45 | Qt5Gui |
33 | 46 | Qt5Help |
47 | Qt5Location | |
34 | 48 | Qt5MultimediaQuick_p |
35 | 49 | Qt5Multimedia |
36 | 50 | Qt5MultimediaWidgets |
39 | 53 | Qt5OpenGL |
40 | 54 | Qt5Positioning |
41 | 55 | Qt5PrintSupport |
56 | Qt5Purchasing | |
42 | 57 | Qt5Qml |
58 | Qt5QuickControls2 | |
43 | 59 | Qt5QuickParticles |
44 | 60 | Qt5Quick |
61 | Qt5QuickTemplates2 | |
45 | 62 | Qt5QuickTest |
63 | Qt5QuickWidgets | |
46 | 64 | Qt5Script |
47 | 65 | Qt5ScriptTools |
66 | Qt5Scxml | |
48 | 67 | Qt5Sensors |
68 | Qt5SerialBus | |
49 | 69 | Qt5SerialPort |
50 | 70 | Qt5Sql |
51 | 71 | Qt5Svg |
52 | 72 | Qt5Test |
73 | Qt5WebChannel | |
74 | Qt5WebEngineCore | |
75 | Qt5WebEngine | |
76 | Qt5WebEngineWidgets | |
53 | 77 | Qt5WebKit |
54 | 78 | Qt5WebKitWidgets |
79 | Qt5WebSockets | |
80 | Qt5WebView | |
55 | 81 | Qt5Widgets |
56 | 82 | Qt5WinExtras |
57 | 83 | Qt5X11Extras |
84 | Qt5XcbQpa | |
58 | 85 | Qt5XmlPatterns |
59 | 86 | Qt5Xml''' |
60 | 87 | class qxx(Task.classes['cxx']): |
81 | 108 | tsk=moc_cache[h_node]=Task.classes['moc'](env=self.env,generator=self.generator) |
82 | 109 | tsk.set_inputs(h_node) |
83 | 110 | tsk.set_outputs(m_node) |
111 | tsk.env.append_unique('MOC_FLAGS','-i') | |
84 | 112 | if self.generator: |
85 | 113 | self.generator.tasks.append(tsk) |
86 | 114 | gen=self.generator.bld.producer |
87 | gen.outstanding.insert(0,tsk) | |
115 | gen.outstanding.appendleft(tsk) | |
88 | 116 | gen.total+=1 |
89 | 117 | return tsk |
90 | 118 | else: |
91 | 119 | delattr(self,'cache_sig') |
92 | def moc_h_ext(self): | |
93 | ext=[] | |
94 | try: | |
95 | ext=Options.options.qt_header_ext.split() | |
96 | except AttributeError: | |
97 | pass | |
98 | if not ext: | |
99 | ext=MOC_H | |
100 | return ext | |
101 | 120 | def add_moc_tasks(self): |
102 | 121 | node=self.inputs[0] |
103 | 122 | bld=self.generator.bld |
109 | 128 | delattr(self,'cache_sig') |
110 | 129 | include_nodes=[node.parent]+self.generator.includes_nodes |
111 | 130 | moctasks=[] |
112 | mocfiles=set([]) | |
131 | mocfiles=set() | |
113 | 132 | for d in bld.raw_deps.get(self.uid(),[]): |
114 | 133 | if not d.endswith('.moc'): |
115 | 134 | continue |
118 | 137 | mocfiles.add(d) |
119 | 138 | h_node=None |
120 | 139 | base2=d[:-4] |
121 | for x in include_nodes: | |
122 | for e in self.moc_h_ext(): | |
123 | h_node=x.find_node(base2+e) | |
124 | if h_node: | |
125 | break | |
126 | if h_node: | |
127 | m_node=h_node.change_ext('.moc') | |
140 | prefix=node.name[:node.name.rfind('.')] | |
141 | if base2==prefix: | |
142 | h_node=node | |
143 | else: | |
144 | for x in include_nodes: | |
145 | for e in MOC_H: | |
146 | h_node=x.find_node(base2+e) | |
147 | if h_node: | |
148 | break | |
149 | else: | |
150 | continue | |
128 | 151 | break |
129 | else: | |
130 | for k in EXT_QT5: | |
131 | if base2.endswith(k): | |
132 | for x in include_nodes: | |
133 | h_node=x.find_node(base2) | |
134 | if h_node: | |
135 | break | |
136 | if h_node: | |
137 | m_node=h_node.change_ext(k+'.moc') | |
138 | break | |
139 | if not h_node: | |
152 | if h_node: | |
153 | m_node=h_node.change_ext('.moc') | |
154 | else: | |
140 | 155 | raise Errors.WafError('No source found for %r which is a moc file'%d) |
141 | 156 | task=self.create_moc_task(h_node,m_node) |
142 | 157 | moctasks.append(task) |
145 | 160 | class trans_update(Task.Task): |
146 | 161 | run_str='${QT_LUPDATE} ${SRC} -ts ${TGT}' |
147 | 162 | color='BLUE' |
148 | Task.update_outputs(trans_update) | |
149 | 163 | class XMLHandler(ContentHandler): |
150 | 164 | def __init__(self): |
151 | 165 | self.buf=[] |
171 | 185 | @extension(*EXT_UI) |
172 | 186 | def create_uic_task(self,node): |
173 | 187 | uictask=self.create_task('ui5',node) |
174 | uictask.outputs=[self.path.find_or_declare(self.env['ui_PATTERN']%node.name[:-3])] | |
188 | uictask.outputs=[node.parent.find_or_declare(self.env.ui_PATTERN%node.name[:-3])] | |
175 | 189 | @extension('.ts') |
176 | 190 | def add_lang(self,node): |
177 | 191 | self.lang=self.to_list(getattr(self,'lang',[]))+[node] |
192 | @feature('qt5') | |
193 | @before_method('process_source') | |
194 | def process_mocs(self): | |
195 | lst=self.to_nodes(getattr(self,'moc',[])) | |
196 | self.source=self.to_list(getattr(self,'source',[])) | |
197 | for x in lst: | |
198 | prefix=x.name[:x.name.rfind('.')] | |
199 | moc_target='moc_%s.%d.cpp'%(prefix,self.idx) | |
200 | moc_node=x.parent.find_or_declare(moc_target) | |
201 | self.source.append(moc_node) | |
202 | self.create_task('moc',x,moc_node) | |
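The new `process_mocs` feature above lets a wscript list headers explicitly via a `moc` attribute instead of relying on `#include "foo.moc"` scanning; each header gets a generated `moc_<name>.<idx>.cpp` appended to the sources. A sketch of just the target-naming step (`moc_targets` is our name; in waf, `idx` is the task generator's per-path counter, which disambiguates same-named headers in different generators):

```python
# Sketch of the moc output naming in process_mocs above: strip the
# header extension and emit moc_<prefix>.<idx>.cpp for each entry.
def moc_targets(headers, idx=0):
    out = []
    for name in headers:
        prefix = name[:name.rfind('.')]
        out.append('moc_%s.%d.cpp' % (prefix, idx))
    return out

print(moc_targets(['mainwindow.h', 'dialog.hpp']))
# -> ['moc_mainwindow.0.cpp', 'moc_dialog.0.cpp']
```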
178 | 203 | @feature('qt5') |
179 | 204 | @after_method('apply_link') |
180 | 205 | def apply_qt5(self): |
197 | 222 | k=create_rcc_task(self,t.outputs[0]) |
198 | 223 | self.link_task.inputs.append(k.outputs[0]) |
199 | 224 | lst=[] |
200 | for flag in self.to_list(self.env['CXXFLAGS']): | |
225 | for flag in self.to_list(self.env.CXXFLAGS): | |
201 | 226 | if len(flag)<2:continue |
202 | 227 | f=flag[0:2] |
203 | 228 | if f in('-D','-I','/D','/I'): |
217 | 242 | return os.path.splitext(self.inputs[0].name)[0] |
218 | 243 | def scan(self): |
219 | 244 | if not has_xml: |
220 | Logs.error('no xml support was found, the rcc dependencies will be incomplete!') | |
245 | Logs.error('No xml.sax support was found, rcc dependencies will be incomplete!') | |
221 | 246 | return([],[]) |
222 | 247 | parser=make_parser() |
223 | 248 | curHandler=XMLHandler() |
259 | 284 | self.find_qt5_libraries() |
260 | 285 | self.add_qt5_rpath() |
261 | 286 | self.simplify_qt5_libs() |
287 | if not has_xml: | |
288 | Logs.error('No xml.sax support was found, rcc dependencies will be incomplete!') | |
289 | if'COMPILER_CXX'not in self.env: | |
290 | self.fatal('No CXX compiler defined: did you forget to configure compiler_cxx first?') | |
291 | frag='#include <QApplication>\nint main(int argc, char **argv) {return 0;}\n' | |
292 | uses='QT5CORE QT5WIDGETS QT5GUI' | |
293 | for flag in[[],'-fPIE','-fPIC','-std=c++11',['-std=c++11','-fPIE'],['-std=c++11','-fPIC']]: | |
294 | msg='See if Qt files compile ' | |
295 | if flag: | |
296 | msg+='with %s'%flag | |
297 | try: | |
298 | self.check(features='qt5 cxx',use=uses,uselib_store='qt5',cxxflags=flag,fragment=frag,msg=msg) | |
299 | except self.errors.ConfigurationError: | |
300 | pass | |
301 | else: | |
302 | break | |
303 | else: | |
304 | self.fatal('Could not build a simple Qt application') | |
305 | from waflib import Utils | |
306 | if Utils.unversioned_sys_platform()=='freebsd': | |
307 | frag='#include <QApplication>\nint main(int argc, char **argv) { QApplication app(argc, argv); return NULL != (void*) (&app);}\n' | |
308 | try: | |
309 | self.check(features='qt5 cxx cxxprogram',use=uses,fragment=frag,msg='Can we link Qt programs on FreeBSD directly?') | |
310 | except self.errors.ConfigurationError: | |
311 | self.check(features='qt5 cxx cxxprogram',use=uses,uselib_store='qt5',libpath='/usr/local/lib',fragment=frag,msg='Is /usr/local/lib required?') | |
262 | 312 | @conf |
263 | 313 | def find_qt5_binaries(self): |
264 | 314 | env=self.env |
269 | 319 | if qtdir: |
270 | 320 | qtbin=os.path.join(qtdir,'bin') |
271 | 321 | if not qtdir: |
272 | qtdir=os.environ.get('QT5_ROOT','') | |
273 | qtbin=os.environ.get('QT5_BIN',None)or os.path.join(qtdir,'bin') | |
322 | qtdir=self.environ.get('QT5_ROOT','') | |
323 | qtbin=self.environ.get('QT5_BIN')or os.path.join(qtdir,'bin') | |
274 | 324 | if qtbin: |
275 | 325 | paths=[qtbin] |
276 | 326 | if not qtdir: |
277 | paths=os.environ.get('PATH','').split(os.pathsep) | |
278 | paths.append('/usr/share/qt5/bin/') | |
327 | paths=self.environ.get('PATH','').split(os.pathsep) | |
328 | paths.extend(['/usr/share/qt5/bin','/usr/local/lib/qt5/bin']) | |
279 | 329 | try: |
280 | 330 | lst=Utils.listdir('/usr/local/Trolltech/') |
281 | 331 | except OSError: |
322 | 372 | self.env.QMAKE=cand |
323 | 373 | else: |
324 | 374 | self.fatal('Could not find qmake for qt5') |
325 | self.env.QT_INSTALL_BINS=qtbin=self.cmd_and_log(self.env.QMAKE+['-query','QT_INSTALL_BINS']).strip()+os.sep | |
375 | self.env.QT_HOST_BINS=qtbin=self.cmd_and_log(self.env.QMAKE+['-query','QT_HOST_BINS']).strip() | |
326 | 376 | paths.insert(0,qtbin) |
327 | 377 | def find_bin(lst,var): |
328 | 378 | if var in env: |
349 | 399 | find_bin(['rcc-qt5','rcc'],'QT_RCC') |
350 | 400 | find_bin(['lrelease-qt5','lrelease'],'QT_LRELEASE') |
351 | 401 | find_bin(['lupdate-qt5','lupdate'],'QT_LUPDATE') |
352 | env['UIC_ST']='%s -o %s' | |
353 | env['MOC_ST']='-o' | |
354 | env['ui_PATTERN']='ui_%s.h' | |
355 | env['QT_LRELEASE_FLAGS']=['-silent'] | |
402 | env.UIC_ST='%s -o %s' | |
403 | env.MOC_ST='-o' | |
404 | env.ui_PATTERN='ui_%s.h' | |
405 | env.QT_LRELEASE_FLAGS=['-silent'] | |
356 | 406 | env.MOCCPPPATH_ST='-I%s' |
357 | 407 | env.MOCDEFINES_ST='-D%s' |
358 | 408 | @conf |
409 | def find_single_qt5_lib(self,name,uselib,qtlibs,qtincludes,force_static): | |
410 | env=self.env | |
411 | if force_static: | |
412 | exts=('.a','.lib') | |
413 | prefix='STLIB' | |
414 | else: | |
415 | exts=('.so','.lib') | |
416 | prefix='LIB' | |
417 | def lib_names(): | |
418 | for x in exts: | |
419 | for k in('','5')if Utils.is_win32 else['']: | |
420 | for p in('lib',''): | |
421 | yield(p,name,k,x) | |
422 | raise StopIteration | |
423 | for tup in lib_names(): | |
424 | k=''.join(tup) | |
425 | path=os.path.join(qtlibs,k) | |
426 | if os.path.exists(path): | |
427 | if env.DEST_OS=='win32': | |
428 | libval=''.join(tup[:-1]) | |
429 | else: | |
430 | libval=name | |
431 | env.append_unique(prefix+'_'+uselib,libval) | |
432 | env.append_unique('%sPATH_%s'%(prefix,uselib),qtlibs) | |
433 | env.append_unique('INCLUDES_'+uselib,qtincludes) | |
434 | env.append_unique('INCLUDES_'+uselib,os.path.join(qtincludes,name.replace('Qt5','Qt'))) | |
435 | return k | |
436 | return False | |
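The new `find_single_qt5_lib` above probes the filesystem for each library under several naming conventions before giving up. A sketch of the candidate filenames its `lib_names` generator produces, written standalone (the helper name `lib_candidates` is ours):

```python
# Sketch of the filename candidates tried by find_single_qt5_lib above:
# every combination of prefix ('lib' or bare), an optional '5' version
# suffix on win32, and the extension set ('.a'/'.lib' when forcing a
# static link, '.so'/'.lib' otherwise).
def lib_candidates(name, force_static=False, is_win32=False):
    exts = ('.a', '.lib') if force_static else ('.so', '.lib')
    out = []
    for x in exts:
        for k in (('', '5') if is_win32 else ('',)):
            for p in ('lib', ''):
                out.append(p + name + k + x)
    return out

print(lib_candidates('Qt5Core'))
# -> ['libQt5Core.so', 'Qt5Core.so', 'libQt5Core.lib', 'Qt5Core.lib']
```

The diff's detection loop falls back to a static probe (`force_static=True`) when the dynamic probe finds nothing, which is why each module can be reported found either way.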
437 | @conf | |
359 | 438 | def find_qt5_libraries(self): |
360 | qtlibs=getattr(Options.options,'qtlibs',None)or os.environ.get("QT5_LIBDIR",None) | |
439 | env=self.env | |
440 | qtlibs=getattr(Options.options,'qtlibs',None)or self.environ.get('QT5_LIBDIR') | |
361 | 441 | if not qtlibs: |
362 | 442 | try: |
363 | qtlibs=self.cmd_and_log(self.env.QMAKE+['-query','QT_INSTALL_LIBS']).strip() | |
443 | qtlibs=self.cmd_and_log(env.QMAKE+['-query','QT_INSTALL_LIBS']).strip() | |
364 | 444 | except Errors.WafError: |
365 | qtdir=self.cmd_and_log(self.env.QMAKE+['-query','QT_INSTALL_PREFIX']).strip()+os.sep | |
445 | qtdir=self.cmd_and_log(env.QMAKE+['-query','QT_INSTALL_PREFIX']).strip() | |
366 | 446 | qtlibs=os.path.join(qtdir,'lib') |
367 | 447 | self.msg('Found the Qt5 libraries in',qtlibs) |
368 | qtincludes=os.environ.get("QT5_INCLUDES",None)or self.cmd_and_log(self.env.QMAKE+['-query','QT_INSTALL_HEADERS']).strip() | |
369 | env=self.env | |
370 | if not'PKG_CONFIG_PATH'in os.environ: | |
371 | os.environ['PKG_CONFIG_PATH']='%s:%s/pkgconfig:/usr/lib/qt5/lib/pkgconfig:/opt/qt5/lib/pkgconfig:/usr/lib/qt5/lib:/opt/qt5/lib'%(qtlibs,qtlibs) | |
448 | qtincludes=self.environ.get('QT5_INCLUDES')or self.cmd_and_log(env.QMAKE+['-query','QT_INSTALL_HEADERS']).strip() | |
449 | force_static=self.environ.get('QT5_FORCE_STATIC') | |
372 | 450 | try: |
373 | if os.environ.get("QT5_XCOMPILE",None): | |
374 | raise self.errors.ConfigurationError() | |
451 | if self.environ.get('QT5_XCOMPILE'): | |
452 | self.fatal('QT5_XCOMPILE Disables pkg-config detection') | |
375 | 453 | self.check_cfg(atleast_pkgconfig_version='0.1') |
376 | 454 | except self.errors.ConfigurationError: |
377 | 455 | for i in self.qt5_vars: |
378 | 456 | uselib=i.upper() |
379 | if Utils.unversioned_sys_platform()=="darwin": | |
380 | frameworkName=i+".framework" | |
457 | if Utils.unversioned_sys_platform()=='darwin': | |
458 | frameworkName=i+'.framework' | |
381 | 459 | qtDynamicLib=os.path.join(qtlibs,frameworkName,i) |
382 | 460 | if os.path.exists(qtDynamicLib): |
383 | 461 | env.append_unique('FRAMEWORK_'+uselib,i) |
385 | 463 | else: |
386 | 464 | self.msg('Checking for %s'%i,False,'YELLOW') |
387 | 465 | env.append_unique('INCLUDES_'+uselib,os.path.join(qtlibs,frameworkName,'Headers')) |
388 | elif env.DEST_OS!="win32": | |
389 | qtDynamicLib=os.path.join(qtlibs,"lib"+i+".so") | |
390 | qtStaticLib=os.path.join(qtlibs,"lib"+i+".a") | |
391 | if os.path.exists(qtDynamicLib): | |
392 | env.append_unique('LIB_'+uselib,i) | |
393 | self.msg('Checking for %s'%i,qtDynamicLib,'GREEN') | |
394 | elif os.path.exists(qtStaticLib): | |
395 | env.append_unique('LIB_'+uselib,i) | |
396 | self.msg('Checking for %s'%i,qtStaticLib,'GREEN') | |
397 | else: | |
398 | self.msg('Checking for %s'%i,False,'YELLOW') | |
399 | env.append_unique('LIBPATH_'+uselib,qtlibs) | |
400 | env.append_unique('INCLUDES_'+uselib,qtincludes) | |
401 | env.append_unique('INCLUDES_'+uselib,os.path.join(qtincludes,i)) | |
402 | else: | |
403 | for k in("lib%s.a","lib%s5.a","%s.lib","%s5.lib"): | |
404 | lib=os.path.join(qtlibs,k%i) | |
405 | if os.path.exists(lib): | |
406 | env.append_unique('LIB_'+uselib,i+k[k.find("%s")+2:k.find('.')]) | |
407 | self.msg('Checking for %s'%i,lib,'GREEN') | |
408 | break | |
409 | else: | |
410 | self.msg('Checking for %s'%i,False,'YELLOW') | |
411 | env.append_unique('LIBPATH_'+uselib,qtlibs) | |
412 | env.append_unique('INCLUDES_'+uselib,qtincludes) | |
413 | env.append_unique('INCLUDES_'+uselib,os.path.join(qtincludes,i.replace('Qt5','Qt'))) | |
414 | uselib=i.upper()+"_debug" | |
415 | for k in("lib%sd.a","lib%sd5.a","%sd.lib","%sd5.lib"): | |
416 | lib=os.path.join(qtlibs,k%i) | |
417 | if os.path.exists(lib): | |
418 | env.append_unique('LIB_'+uselib,i+k[k.find("%s")+2:k.find('.')]) | |
419 | self.msg('Checking for %s'%i,lib,'GREEN') | |
420 | break | |
421 | else: | |
422 | self.msg('Checking for %s'%i,False,'YELLOW') | |
423 | env.append_unique('LIBPATH_'+uselib,qtlibs) | |
424 | env.append_unique('INCLUDES_'+uselib,qtincludes) | |
425 | env.append_unique('INCLUDES_'+uselib,os.path.join(qtincludes,i.replace('Qt5','Qt'))) | |
466 | else: | |
467 | for j in('','d'): | |
468 | k='_DEBUG'if j=='d'else'' | |
469 | ret=self.find_single_qt5_lib(i+j,uselib+k,qtlibs,qtincludes,force_static) | |
470 | if not force_static and not ret: | |
471 | ret=self.find_single_qt5_lib(i+j,uselib+k,qtlibs,qtincludes,True) | |
472 | self.msg('Checking for %s'%(i+j),ret,'GREEN'if ret else'YELLOW') | |
426 | 473 | else: |
474 | path='%s:%s:%s/pkgconfig:/usr/lib/qt5/lib/pkgconfig:/opt/qt5/lib/pkgconfig:/usr/lib/qt5/lib:/opt/qt5/lib'%(self.environ.get('PKG_CONFIG_PATH',''),qtlibs,qtlibs) | |
427 | 475 | for i in self.qt5_vars_debug+self.qt5_vars: |
428 | self.check_cfg(package=i,args='--cflags --libs',mandatory=False) | |
476 | self.check_cfg(package=i,args='--cflags --libs',mandatory=False,force_static=force_static,pkg_config_path=path) | |
429 | 477 | @conf |
430 | 478 | def simplify_qt5_libs(self): |
431 | 479 | env=self.env |
470 | 518 | self.qt5_vars=QT5_LIBS |
471 | 519 | self.qt5_vars=Utils.to_list(self.qt5_vars) |
472 | 520 | if not hasattr(self,'qt5_vars_debug'): |
473 | self.qt5_vars_debug=[a+'_debug'for a in self.qt5_vars] | |
521 | self.qt5_vars_debug=[a+'_DEBUG'for a in self.qt5_vars] | |
474 | 522 | self.qt5_vars_debug=Utils.to_list(self.qt5_vars_debug) |
475 | 523 | @conf |
476 | 524 | def set_qt5_defines(self): |
482 | 530 | self.env.append_unique('DEFINES_%s_DEBUG'%x.upper(),'QT_%s_LIB'%y) |
483 | 531 | def options(opt): |
484 | 532 | opt.add_option('--want-rpath',action='store_true',default=False,dest='want_rpath',help='enable the rpath for qt libraries') |
485 | opt.add_option('--header-ext',type='string',default='',help='header extension for moc files',dest='qt_header_ext') | |
486 | 533 | for i in'qtdir qtbin qtlibs'.split(): |
487 | 534 | opt.add_option('--'+i,type='string',default='',dest=i) |
488 | opt.add_option('--translate',action="store_true",help="collect translation strings",dest="trans_qt5",default=False) | |
535 | opt.add_option('--translate',action='store_true',help='collect translation strings',dest='trans_qt5',default=False) |
6 | 6 | from waflib.TaskGen import before_method,feature,extension |
7 | 7 | from waflib.Configure import conf |
8 | 8 | @feature('rubyext') |
9 | @before_method('apply_incpaths','apply_lib_vars','apply_bundle','apply_link') | |
9 | @before_method('apply_incpaths','process_source','apply_bundle','apply_link') | |
10 | 10 | def init_rubyext(self): |
11 | 11 | self.install_path='${ARCHDIR_RUBY}' |
12 | 12 | self.uselib=self.to_list(getattr(self,'uselib','')) |
15 | 15 | if not'RUBYEXT'in self.uselib: |
16 | 16 | self.uselib.append('RUBYEXT') |
17 | 17 | @feature('rubyext') |
18 | @before_method('apply_link','propagate_uselib') | |
18 | @before_method('apply_link','propagate_uselib_vars') | |
19 | 19 | def apply_ruby_so_name(self): |
20 | self.env['cshlib_PATTERN']=self.env['cxxshlib_PATTERN']=self.env['rubyext_PATTERN'] | |
20 | self.env.cshlib_PATTERN=self.env.cxxshlib_PATTERN=self.env.rubyext_PATTERN | |
21 | 21 | @conf |
22 | 22 | def check_ruby_version(self,minver=()): |
23 | if Options.options.rubybinary: | |
24 | self.env.RUBY=Options.options.rubybinary | |
25 | else: | |
26 | self.find_program('ruby',var='RUBY') | |
27 | ruby=self.env.RUBY | |
23 | ruby=self.find_program('ruby',var='RUBY',value=Options.options.rubybinary) | |
28 | 24 | try: |
29 | 25 | version=self.cmd_and_log(ruby+['-e','puts defined?(VERSION) ? VERSION : RUBY_VERSION']).strip() |
30 | 26 | except Exception: |
16 | 16 | @conf |
17 | 17 | def scc_common_flags(conf): |
18 | 18 | v=conf.env |
19 | v['CC_SRC_F']=[] | |
20 | v['CC_TGT_F']=['-c','-o'] | |
21 | if not v['LINK_CC']:v['LINK_CC']=v['CC'] | |
22 | v['CCLNK_SRC_F']='' | |
23 | v['CCLNK_TGT_F']=['-o'] | |
24 | v['CPPPATH_ST']='-I%s' | |
25 | v['DEFINES_ST']='-D%s' | |
26 | v['LIB_ST']='-l%s' | |
27 | v['LIBPATH_ST']='-L%s' | |
28 | v['STLIB_ST']='-l%s' | |
29 | v['STLIBPATH_ST']='-L%s' | |
30 | v['SONAME_ST']='-Wl,-h,%s' | |
31 | v['SHLIB_MARKER']='-Bdynamic' | |
32 | v['STLIB_MARKER']='-Bstatic' | |
33 | v['cprogram_PATTERN']='%s' | |
34 | v['CFLAGS_cshlib']=['-xcode=pic32','-DPIC'] | |
35 | v['LINKFLAGS_cshlib']=['-G'] | |
36 | v['cshlib_PATTERN']='lib%s.so' | |
37 | v['LINKFLAGS_cstlib']=['-Bstatic'] | |
38 | v['cstlib_PATTERN']='lib%s.a' | |
19 | v.CC_SRC_F=[] | |
20 | v.CC_TGT_F=['-c','-o',''] | |
21 | if not v.LINK_CC: | |
22 | v.LINK_CC=v.CC | |
23 | v.CCLNK_SRC_F='' | |
24 | v.CCLNK_TGT_F=['-o',''] | |
25 | v.CPPPATH_ST='-I%s' | |
26 | v.DEFINES_ST='-D%s' | |
27 | v.LIB_ST='-l%s' | |
28 | v.LIBPATH_ST='-L%s' | |
29 | v.STLIB_ST='-l%s' | |
30 | v.STLIBPATH_ST='-L%s' | |
31 | v.SONAME_ST='-Wl,-h,%s' | |
32 | v.SHLIB_MARKER='-Bdynamic' | |
33 | v.STLIB_MARKER='-Bstatic' | |
34 | v.cprogram_PATTERN='%s' | |
35 | v.CFLAGS_cshlib=['-xcode=pic32','-DPIC'] | |
36 | v.LINKFLAGS_cshlib=['-G'] | |
37 | v.cshlib_PATTERN='lib%s.so' | |
38 | v.LINKFLAGS_cstlib=['-Bstatic'] | |
39 | v.cstlib_PATTERN='lib%s.a' | |
39 | 40 | def configure(conf): |
40 | 41 | conf.find_scc() |
41 | 42 | conf.find_ar() |
16 | 16 | @conf |
17 | 17 | def sxx_common_flags(conf): |
18 | 18 | v=conf.env |
19 | v['CXX_SRC_F']=[] | |
20 | v['CXX_TGT_F']=['-c','-o'] | |
21 | if not v['LINK_CXX']:v['LINK_CXX']=v['CXX'] | |
22 | v['CXXLNK_SRC_F']=[] | |
23 | v['CXXLNK_TGT_F']=['-o'] | |
24 | v['CPPPATH_ST']='-I%s' | |
25 | v['DEFINES_ST']='-D%s' | |
26 | v['LIB_ST']='-l%s' | |
27 | v['LIBPATH_ST']='-L%s' | |
28 | v['STLIB_ST']='-l%s' | |
29 | v['STLIBPATH_ST']='-L%s' | |
30 | v['SONAME_ST']='-Wl,-h,%s' | |
31 | v['SHLIB_MARKER']='-Bdynamic' | |
32 | v['STLIB_MARKER']='-Bstatic' | |
33 | v['cxxprogram_PATTERN']='%s' | |
34 | v['CXXFLAGS_cxxshlib']=['-xcode=pic32','-DPIC'] | |
35 | v['LINKFLAGS_cxxshlib']=['-G'] | |
36 | v['cxxshlib_PATTERN']='lib%s.so' | |
37 | v['LINKFLAGS_cxxstlib']=['-Bstatic'] | |
38 | v['cxxstlib_PATTERN']='lib%s.a' | |
19 | v.CXX_SRC_F=[] | |
20 | v.CXX_TGT_F=['-c','-o',''] | |
21 | if not v.LINK_CXX: | |
22 | v.LINK_CXX=v.CXX | |
23 | v.CXXLNK_SRC_F=[] | |
24 | v.CXXLNK_TGT_F=['-o',''] | |
25 | v.CPPPATH_ST='-I%s' | |
26 | v.DEFINES_ST='-D%s' | |
27 | v.LIB_ST='-l%s' | |
28 | v.LIBPATH_ST='-L%s' | |
29 | v.STLIB_ST='-l%s' | |
30 | v.STLIBPATH_ST='-L%s' | |
31 | v.SONAME_ST='-Wl,-h,%s' | |
32 | v.SHLIB_MARKER='-Bdynamic' | |
33 | v.STLIB_MARKER='-Bstatic' | |
34 | v.cxxprogram_PATTERN='%s' | |
35 | v.CXXFLAGS_cxxshlib=['-xcode=pic32','-DPIC'] | |
36 | v.LINKFLAGS_cxxshlib=['-G'] | |
37 | v.cxxshlib_PATTERN='lib%s.so' | |
38 | v.LINKFLAGS_cxxstlib=['-Bstatic'] | |
39 | v.cxxstlib_PATTERN='lib%s.a' | |
39 | 40 | def configure(conf): |
40 | 41 | conf.find_sxx() |
41 | 42 | conf.find_ar() |
14 | 14 | path=match.group('file') |
15 | 15 | if path: |
16 | 16 | for k in('','.bib'): |
17 | Logs.debug('tex: trying %s%s'%(path,k)) | |
17 | Logs.debug('tex: trying %s%s',path,k) | |
18 | 18 | fi=node.parent.find_resource(path+k) |
19 | 19 | if fi: |
20 | 20 | nodes.append(fi) |
21 | 21 | else: |
22 | Logs.debug('tex: could not find %s'%path) | |
23 | Logs.debug("tex: found the following bibunit files: %s"%nodes) | |
22 | Logs.debug('tex: could not find %s',path) | |
23 | Logs.debug('tex: found the following bibunit files: %s',nodes) | |
24 | 24 | return nodes |
25 | 25 | exts_deps_tex=['','.ltx','.tex','.bib','.pdf','.png','.eps','.ps','.sty'] |
26 | 26 | exts_tex=['.ltx','.tex'] |
41 | 41 | Execute the program **makeglossaries** |
42 | 42 | """ |
43 | 43 | def exec_command(self,cmd,**kw): |
44 | bld=self.generator.bld | |
45 | Logs.info('runner: %r'%cmd) | |
46 | try: | |
47 | if not kw.get('cwd',None): | |
48 | kw['cwd']=bld.cwd | |
49 | except AttributeError: | |
50 | bld.cwd=kw['cwd']=bld.variant_dir | |
51 | return Utils.subprocess.Popen(cmd,**kw).wait() | |
44 | if self.env.PROMPT_LATEX: | |
45 | kw['stdout']=kw['stderr']=None | |
46 | return super(tex,self).exec_command(cmd,**kw) | |
52 | 47 | def scan_aux(self,node): |
53 | 48 | nodes=[node] |
54 | 49 | re_aux=re.compile(r'\\@input{(?P<file>[^{}]*)}',re.M) |
58 | 53 | path=match.group('file') |
59 | 54 | found=node.parent.find_or_declare(path) |
60 | 55 | if found and found not in nodes: |
61 | Logs.debug('tex: found aux node '+found.abspath()) | |
56 | Logs.debug('tex: found aux node %r',found) | |
62 | 57 | nodes.append(found) |
63 | 58 | parse_node(found) |
64 | 59 | parse_node(node) |
89 | 84 | found=None |
90 | 85 | for k in exts_deps_tex: |
91 | 86 | for up in self.texinputs_nodes: |
92 | Logs.debug('tex: trying %s%s'%(path,k)) | |
87 | Logs.debug('tex: trying %s%s',path,k) | |
93 | 88 | found=up.find_resource(path+k) |
94 | 89 | if found: |
95 | 90 | break |
113 | 108 | parse_node(node) |
114 | 109 | for x in nodes: |
115 | 110 | x.parent.get_bld().mkdir() |
116 | Logs.debug("tex: found the following : %s and names %s"%(nodes,names)) | |
111 | Logs.debug("tex: found the following : %s and names %s",nodes,names) | |
117 | 112 | return(nodes,names) |
118 | 113 | def check_status(self,msg,retcode): |
119 | 114 | if retcode!=0: |
120 | raise Errors.WafError("%r command exit status %r"%(msg,retcode)) | |
115 | raise Errors.WafError('%r command exit status %r'%(msg,retcode)) | |
116 | def info(self,*k,**kw): | |
117 | try: | |
118 | info=self.generator.bld.conf.logger.info | |
119 | except AttributeError: | |
120 | info=Logs.info | |
121 | info(*k,**kw) | |
121 | 122 | def bibfile(self): |
122 | 123 | for aux_node in self.aux_nodes: |
123 | 124 | try: |
124 | 125 | ct=aux_node.read() |
125 | 126 | except EnvironmentError: |
126 | Logs.error('Error reading %s: %r'%aux_node.abspath()) | |
127 | Logs.error('Error reading %s: %r',aux_node.abspath()) | |
127 | 128 | continue |
128 | 129 | if g_bibtex_re.findall(ct): |
129 | Logs.info('calling bibtex') | |
130 | self.info('calling bibtex') | |
130 | 131 | self.env.env={} |
131 | 132 | self.env.env.update(os.environ) |
132 | 133 | self.env.env.update({'BIBINPUTS':self.texinputs(),'BSTINPUTS':self.texinputs()}) |
147 | 148 | if bibunits: |
148 | 149 | fn=['bu'+str(i)for i in range(1,len(bibunits)+1)] |
149 | 150 | if fn: |
150 | Logs.info('calling bibtex on bibunits') | |
151 | self.info('calling bibtex on bibunits') | |
151 | 152 | for f in fn: |
152 | 153 | self.env.env={'BIBINPUTS':self.texinputs(),'BSTINPUTS':self.texinputs()} |
153 | 154 | self.env.SRCFILE=f |
158 | 159 | idx_path=self.idx_node.abspath() |
159 | 160 | os.stat(idx_path) |
160 | 161 | except OSError: |
161 | Logs.info('index file %s absent, not calling makeindex'%idx_path) | |
162 | self.info('index file %s absent, not calling makeindex',idx_path) | |
162 | 163 | else: |
163 | Logs.info('calling makeindex') | |
164 | self.info('calling makeindex') | |
164 | 165 | self.env.SRCFILE=self.idx_node.name |
165 | 166 | self.env.env={} |
166 | 167 | self.check_status('error when calling makeindex %s'%idx_path,self.makeindex_fun()) |
176 | 177 | try: |
177 | 178 | ct=aux_node.read() |
178 | 179 | except EnvironmentError: |
179 | Logs.error('Error reading %s: %r'%aux_node.abspath()) | |
180 | Logs.error('Error reading %s: %r',aux_node.abspath()) | |
180 | 181 | continue |
181 | 182 | if g_glossaries_re.findall(ct): |
182 | 183 | if not self.env.MAKEGLOSSARIES: |
189 | 190 | return os.pathsep.join([k.abspath()for k in self.texinputs_nodes])+os.pathsep |
190 | 191 | def run(self): |
191 | 192 | env=self.env |
192 | if not env['PROMPT_LATEX']: | |
193 | if not env.PROMPT_LATEX: | |
193 | 194 | env.append_value('LATEXFLAGS','-interaction=batchmode') |
194 | 195 | env.append_value('PDFLATEXFLAGS','-interaction=batchmode') |
195 | 196 | env.append_value('XELATEXFLAGS','-interaction=batchmode') |
196 | self.cwd=self.inputs[0].parent.get_bld().abspath() | |
197 | Logs.info('first pass on %s'%self.__class__.__name__) | |
197 | self.cwd=self.inputs[0].parent.get_bld() | |
198 | self.info('first pass on %s',self.__class__.__name__) | |
198 | 199 | cur_hash=self.hash_aux_nodes() |
199 | 200 | self.call_latex() |
200 | 201 | self.hash_aux_nodes() |
210 | 211 | Logs.error('No aux.h to process') |
211 | 212 | if cur_hash and cur_hash==prev_hash: |
212 | 213 | break |
213 | Logs.info('calling %s'%self.__class__.__name__) | |
214 | self.info('calling %s',self.__class__.__name__) | |
214 | 215 | self.call_latex() |
215 | 216 | def hash_aux_nodes(self): |
216 | 217 | try: |
251 | 252 | if not getattr(self,'type',None)in('latex','pdflatex','xelatex'): |
252 | 253 | self.type='pdflatex' |
253 | 254 | outs=Utils.to_list(getattr(self,'outs',[])) |
254 | self.env['PROMPT_LATEX']=getattr(self,'prompt',1) | |
255 | try: | |
256 | self.generator.bld.conf | |
257 | except AttributeError: | |
258 | default_prompt=False | |
259 | else: | |
260 | default_prompt=True | |
261 | self.env.PROMPT_LATEX=getattr(self,'prompt',default_prompt) | |
255 | 262 | deps_lst=[] |
256 | 263 | if getattr(self,'deps',None): |
257 | 264 | deps=self.to_list(self.deps) |
292 | 299 | if p: |
293 | 300 | task.texinputs_nodes.append(p) |
294 | 301 | else: |
295 | Logs.error('Invalid TEXINPUTS folder %s'%x) | |
302 | Logs.error('Invalid TEXINPUTS folder %s',x) | |
296 | 303 | else: |
297 | Logs.error('Cannot resolve relative paths in TEXINPUTS %s'%x) | |
304 | Logs.error('Cannot resolve relative paths in TEXINPUTS %s',x) | |
298 | 305 | if self.type=='latex': |
299 | 306 | if'ps'in outs: |
300 | 307 | tsk=self.create_task('dvips',task.outputs,node.change_ext('.ps')) |
313 | 320 | self.find_program(p,var=p.upper()) |
314 | 321 | except self.errors.ConfigurationError: |
315 | 322 | pass |
316 | v['DVIPSFLAGS']='-Ppdf' | |
323 | v.DVIPSFLAGS='-Ppdf' |
18 | 18 | if self.generator.dump_deps_node: |
19 | 19 | self.generator.dump_deps_node.write('\n'.join(self.generator.packages)) |
20 | 20 | return ret |
21 | valac=Task.update_outputs(valac) | |
22 | 21 | @taskgen_method |
23 | 22 | def init_vala_task(self): |
24 | 23 | self.profile=getattr(self,'profile','gobject') |
24 | self.packages=packages=Utils.to_list(getattr(self,'packages',[])) | |
25 | self.use=Utils.to_list(getattr(self,'use',[])) | |
26 | if packages and not self.use: | |
27 | self.use=packages[:] | |
25 | 28 | if self.profile=='gobject': |
26 | self.uselib=Utils.to_list(getattr(self,'uselib',[])) | |
27 | if not'GOBJECT'in self.uselib: | |
28 | self.uselib.append('GOBJECT') | |
29 | if not'GOBJECT'in self.use: | |
30 | self.use.append('GOBJECT') | |
29 | 31 | def addflags(flags): |
30 | 32 | self.env.append_value('VALAFLAGS',flags) |
31 | 33 | if self.profile: |
45 | 47 | addflags('--directory=%s'%valatask.vala_dir_node.abspath()) |
46 | 48 | if hasattr(self,'thread'): |
47 | 49 | if self.profile=='gobject': |
48 | if not'GTHREAD'in self.uselib: | |
49 | self.uselib.append('GTHREAD') | |
50 | else: | |
51 | Logs.warn("Profile %s means no threading support"%self.profile) | |
50 | if not'GTHREAD'in self.use: | |
51 | self.use.append('GTHREAD') | |
52 | else: | |
53 | Logs.warn('Profile %s means no threading support',self.profile) | |
52 | 54 | self.thread=False |
53 | 55 | if self.thread: |
54 | 56 | addflags('--thread') |
79 | 81 | api_version=version[0]+".0" |
80 | 82 | return api_version |
81 | 83 | self.includes=Utils.to_list(getattr(self,'includes',[])) |
82 | self.uselib=self.to_list(getattr(self,'uselib',[])) | |
83 | 84 | valatask.install_path=getattr(self,'install_path','') |
84 | 85 | valatask.vapi_path=getattr(self,'vapi_path','${DATAROOTDIR}/vala/vapi') |
85 | valatask.pkg_name=getattr(self,'pkg_name',self.env['PACKAGE']) | |
86 | valatask.pkg_name=getattr(self,'pkg_name',self.env.PACKAGE) | |
86 | 87 | valatask.header_path=getattr(self,'header_path','${INCLUDEDIR}/%s-%s'%(valatask.pkg_name,_get_api_version())) |
87 | 88 | valatask.install_binding=getattr(self,'install_binding',True) |
88 | self.packages=packages=Utils.to_list(getattr(self,'packages',[])) | |
89 | 89 | self.vapi_dirs=vapi_dirs=Utils.to_list(getattr(self,'vapi_dirs',[])) |
90 | 90 | if hasattr(self,'use'): |
91 | 91 | local_packages=Utils.to_list(self.use)[:] |
121 | 121 | else: |
122 | 122 | v_node=self.path.find_dir(vapi_dir) |
123 | 123 | if not v_node: |
124 | Logs.warn('Unable to locate Vala API directory: %r'%vapi_dir) | |
124 | Logs.warn('Unable to locate Vala API directory: %r',vapi_dir) | |
125 | 125 | else: |
126 | 126 | addflags('--vapidir=%s'%v_node.abspath()) |
127 | 127 | self.dump_deps_node=None |
128 | 128 | if self.is_lib and self.packages: |
129 | 129 | self.dump_deps_node=valatask.vala_dir_node.find_or_declare('%s.deps'%self.target) |
130 | 130 | valatask.outputs.append(self.dump_deps_node) |
131 | self.includes.append(self.bld.srcnode.abspath()) | |
132 | self.includes.append(self.bld.bldnode.abspath()) | |
133 | 131 | if self.is_lib and valatask.install_binding: |
134 | 132 | headers_list=[o for o in valatask.outputs if o.suffix()==".h"] |
135 | 133 | try: |
136 | 134 | self.install_vheader.source=headers_list |
137 | 135 | except AttributeError: |
138 | self.install_vheader=self.bld.install_files(valatask.header_path,headers_list,self.env) | |
136 | self.install_vheader=self.add_install_files(install_to=valatask.header_path,install_from=headers_list) | |
139 | 137 | vapi_list=[o for o in valatask.outputs if(o.suffix()in(".vapi",".deps"))] |
140 | 138 | try: |
141 | 139 | self.install_vapi.source=vapi_list |
142 | 140 | except AttributeError: |
143 | self.install_vapi=self.bld.install_files(valatask.vapi_path,vapi_list,self.env) | |
141 | self.install_vapi=self.add_install_files(install_to=valatask.vapi_path,install_from=vapi_list) | |
144 | 142 | gir_list=[o for o in valatask.outputs if o.suffix()=='.gir'] |
145 | 143 | try: |
146 | 144 | self.install_gir.source=gir_list |
147 | 145 | except AttributeError: |
148 | self.install_gir=self.bld.install_files(getattr(self,'gir_path','${DATAROOTDIR}/gir-1.0'),gir_list,self.env) | |
146 | self.install_gir=self.add_install_files(install_to=getattr(self,'gir_path','${DATAROOTDIR}/gir-1.0'),install_from=gir_list) | |
149 | 147 | if hasattr(self,'vala_resources'): |
150 | 148 | nodes=self.to_nodes(self.vala_resources) |
151 | 149 | valatask.vala_exclude=getattr(valatask,'vala_exclude',[])+nodes |
172 | 170 | except Exception: |
173 | 171 | valac_version=None |
174 | 172 | else: |
175 | ver=re.search(r'\d+.\d+.\d+',output).group(0).split('.') | |
173 | ver=re.search(r'\d+.\d+.\d+',output).group().split('.') | |
176 | 174 | valac_version=tuple([int(x)for x in ver]) |
177 | 175 | self.msg('Checking for %s version >= %r'%(valac_name,min_version),valac_version,valac_version and valac_version>=min_version) |
178 | 176 | if valac and valac_version<min_version: |
179 | 177 | self.fatal("%s version %r is too old, need >= %r"%(valac_name,valac_version,min_version)) |
180 | self.env['VALAC_VERSION']=valac_version | |
178 | self.env.VALAC_VERSION=valac_version | |
181 | 179 | return valac |
182 | 180 | @conf |
183 | 181 | def check_vala(self,min_version=(0,8,0),branch=None): |
182 | if self.env.VALA_MINVER: | |
183 | min_version=self.env.VALA_MINVER | |
184 | if self.env.VALA_MINVER_BRANCH: | |
185 | branch=self.env.VALA_MINVER_BRANCH | |
184 | 186 | if not branch: |
185 | 187 | branch=min_version[:2] |
186 | 188 | try: |
189 | 191 | find_valac(self,'valac',min_version) |
190 | 192 | @conf |
191 | 193 | def check_vala_deps(self): |
192 | if not self.env['HAVE_GOBJECT']: | |
194 | if not self.env.HAVE_GOBJECT: | |
193 | 195 | pkg_args={'package':'gobject-2.0','uselib_store':'GOBJECT','args':'--cflags --libs'} |
194 | 196 | if getattr(Options.options,'vala_target_glib',None): |
195 | 197 | pkg_args['atleast_version']=Options.options.vala_target_glib |
196 | 198 | self.check_cfg(**pkg_args) |
197 | if not self.env['HAVE_GTHREAD']: | |
199 | if not self.env.HAVE_GTHREAD: | |
198 | 200 | pkg_args={'package':'gthread-2.0','uselib_store':'GTHREAD','args':'--cflags --libs'} |
199 | 201 | if getattr(Options.options,'vala_target_glib',None): |
200 | 202 | pkg_args['atleast_version']=Options.options.vala_target_glib |
1 | 1 | # encoding: utf-8 |
2 | 2 | # WARNING! Do not edit! https://waf.io/book/index.html#_obtaining_the_waf_file |
3 | 3 | |
4 | import os | |
4 | import os,sys | |
5 | 5 | from waflib.TaskGen import feature,after_method,taskgen_method |
6 | 6 | from waflib import Utils,Task,Logs,Options |
7 | from waflib.Tools import ccroot | |
7 | 8 | testlock=Utils.threading.Lock() |
9 | SCRIPT_TEMPLATE="""#! %(python)s | |
10 | import subprocess, sys | |
11 | cmd = %(cmd)r | |
12 | # if you want to debug with gdb: | |
13 | #cmd = ['gdb', '-args'] + cmd | |
14 | env = %(env)r | |
15 | status = subprocess.call(cmd, env=env, cwd=%(cwd)r, shell=isinstance(cmd, str)) | |
16 | sys.exit(status) | |
17 | """ | |
8 | 18 | @feature('test') |
9 | @after_method('apply_link') | |
19 | @after_method('apply_link','process_use') | |
10 | 20 | def make_test(self): |
11 | if getattr(self,'link_task',None): | |
12 | self.create_task('utest',self.link_task.outputs) | |
21 | if not getattr(self,'link_task',None): | |
22 | return | |
23 | tsk=self.create_task('utest',self.link_task.outputs) | |
24 | if getattr(self,'ut_str',None): | |
25 | self.ut_run,lst=Task.compile_fun(self.ut_str,shell=getattr(self,'ut_shell',False)) | |
26 | tsk.vars=lst+tsk.vars | |
27 | if getattr(self,'ut_cwd',None): | |
28 | if isinstance(self.ut_cwd,str): | |
29 | if os.path.isabs(self.ut_cwd): | |
30 | self.ut_cwd=self.bld.root.make_node(self.ut_cwd) | |
31 | else: | |
32 | self.ut_cwd=self.path.make_node(self.ut_cwd) | |
33 | else: | |
34 | self.ut_cwd=tsk.inputs[0].parent | |
35 | if not hasattr(self,'ut_paths'): | |
36 | paths=[] | |
37 | for x in self.tmp_use_sorted: | |
38 | try: | |
39 | y=self.bld.get_tgen_by_name(x).link_task | |
40 | except AttributeError: | |
41 | pass | |
42 | else: | |
43 | if not isinstance(y,ccroot.stlink_task): | |
44 | paths.append(y.outputs[0].parent.abspath()) | |
45 | self.ut_paths=os.pathsep.join(paths)+os.pathsep | |
46 | if not hasattr(self,'ut_env'): | |
47 | self.ut_env=dct=dict(os.environ) | |
48 | def add_path(var): | |
49 | dct[var]=self.ut_paths+dct.get(var,'') | |
50 | if Utils.is_win32: | |
51 | add_path('PATH') | |
52 | elif Utils.unversioned_sys_platform()=='darwin': | |
53 | add_path('DYLD_LIBRARY_PATH') | |
54 | add_path('LD_LIBRARY_PATH') | |
55 | else: | |
56 | add_path('LD_LIBRARY_PATH') | |
13 | 57 | @taskgen_method |
14 | 58 | def add_test_results(self,tup): |
15 | 59 | Logs.debug("ut: %r",tup) |
30 | 74 | if getattr(Options.options,'all_tests',False): |
31 | 75 | return Task.RUN_ME |
32 | 76 | return ret |
33 | def add_path(self,dct,path,var): | |
34 | dct[var]=os.pathsep.join(Utils.to_list(path)+[os.environ.get(var,'')]) | |
35 | 77 | def get_test_env(self): |
36 | try: | |
37 | fu=getattr(self.generator.bld,'all_test_paths') | |
38 | except AttributeError: | |
39 | fu=os.environ.copy() | |
40 | lst=[] | |
41 | for g in self.generator.bld.groups: | |
42 | for tg in g: | |
43 | if getattr(tg,'link_task',None): | |
44 | s=tg.link_task.outputs[0].parent.abspath() | |
45 | if s not in lst: | |
46 | lst.append(s) | |
47 | if Utils.is_win32: | |
48 | self.add_path(fu,lst,'PATH') | |
49 | elif Utils.unversioned_sys_platform()=='darwin': | |
50 | self.add_path(fu,lst,'DYLD_LIBRARY_PATH') | |
51 | self.add_path(fu,lst,'LD_LIBRARY_PATH') | |
52 | else: | |
53 | self.add_path(fu,lst,'LD_LIBRARY_PATH') | |
54 | self.generator.bld.all_test_paths=fu | |
55 | return fu | |
78 | return self.generator.ut_env | |
79 | def post_run(self): | |
80 | super(utest,self).post_run() | |
81 | if getattr(Options.options,'clear_failed_tests',False)and self.waf_unit_test_results[1]: | |
82 | self.generator.bld.task_sigs[self.uid()]=None | |
56 | 83 | def run(self): |
57 | filename=self.inputs[0].abspath() | |
58 | self.ut_exec=getattr(self.generator,'ut_exec',[filename]) | |
84 | if hasattr(self.generator,'ut_run'): | |
85 | return self.generator.ut_run(self) | |
86 | self.ut_exec=getattr(self.generator,'ut_exec',[self.inputs[0].abspath()]) | |
59 | 87 | if getattr(self.generator,'ut_fun',None): |
60 | 88 | self.generator.ut_fun(self) |
61 | cwd=getattr(self.generator,'ut_cwd','')or self.inputs[0].parent.abspath() | |
62 | 89 | testcmd=getattr(self.generator,'ut_cmd',False)or getattr(Options.options,'testcmd',False) |
63 | 90 | if testcmd: |
64 | self.ut_exec=(testcmd%" ".join(self.ut_exec)).split(' ') | |
65 | proc=Utils.subprocess.Popen(self.ut_exec,cwd=cwd,env=self.get_test_env(),stderr=Utils.subprocess.PIPE,stdout=Utils.subprocess.PIPE) | |
91 | self.ut_exec=(testcmd%' '.join(self.ut_exec)).split(' ') | |
92 | return self.exec_command(self.ut_exec) | |
93 | def exec_command(self,cmd,**kw): | |
94 | Logs.debug('runner: %r',cmd) | |
95 | if getattr(Options.options,'dump_test_scripts',False): | |
96 | global SCRIPT_TEMPLATE | |
97 | script_code=SCRIPT_TEMPLATE%{'python':sys.executable,'env':self.get_test_env(),'cwd':self.get_cwd().abspath(),'cmd':cmd} | |
98 | script_file=self.inputs[0].abspath()+'_run.py' | |
99 | Utils.writef(script_file,script_code) | |
100 | os.chmod(script_file,Utils.O755) | |
101 | if Logs.verbose>1: | |
102 | Logs.info('Test debug file written as %r'%script_file) | |
103 | proc=Utils.subprocess.Popen(cmd,cwd=self.get_cwd().abspath(),env=self.get_test_env(),stderr=Utils.subprocess.PIPE,stdout=Utils.subprocess.PIPE) | |
66 | 104 | (stdout,stderr)=proc.communicate() |
67 | self.waf_unit_test_results=tup=(filename,proc.returncode,stdout,stderr) | |
105 | self.waf_unit_test_results=tup=(self.inputs[0].abspath(),proc.returncode,stdout,stderr) | |
68 | 106 | testlock.acquire() |
69 | 107 | try: |
70 | 108 | return self.generator.add_test_results(tup) |
71 | 109 | finally: |
72 | 110 | testlock.release() |
73 | def post_run(self): | |
74 | super(utest,self).post_run() | |
75 | if getattr(Options.options,'clear_failed_tests',False)and self.waf_unit_test_results[1]: | |
76 | self.generator.bld.task_sigs[self.uid()]=None | |
111 | def get_cwd(self): | |
112 | return self.generator.ut_cwd | |
77 | 113 | def summary(bld): |
78 | 114 | lst=getattr(bld,'utest_results',[]) |
79 | 115 | if lst: |
103 | 139 | opt.add_option('--alltests',action='store_true',default=False,help='Exec all unit tests',dest='all_tests') |
104 | 140 | opt.add_option('--clear-failed',action='store_true',default=False,help='Force failed unit tests to run again next time',dest='clear_failed_tests') |
105 | 141 | opt.add_option('--testcmd',action='store',default=False,help='Run the unit tests using the test-cmd string'' example "--test-cmd="valgrind --error-exitcode=1'' %s" to run under valgrind',dest='testcmd') |
142 | opt.add_option('--dump-test-scripts',action='store_true',default=False,help='Create python scripts to help debug tests',dest='dump_test_scripts') |
1 | 1 | # encoding: utf-8 |
2 | 2 | # WARNING! Do not edit! https://waf.io/book/index.html#_obtaining_the_waf_file |
3 | 3 | |
4 | import re,traceback | |
5 | from waflib import Task,Logs,Utils | |
4 | import re | |
5 | from waflib import Task | |
6 | 6 | from waflib.TaskGen import extension |
7 | 7 | from waflib.Tools import c_preproc |
8 | 8 | @extension('.rc') |
9 | 9 | def rc_file(self,node): |
10 | 10 | obj_ext='.rc.o' |
11 | if self.env['WINRC_TGT_F']=='/fo': | |
11 | if self.env.WINRC_TGT_F=='/fo': | |
12 | 12 | obj_ext='.res' |
13 | 13 | rctask=self.create_task('winrc',node,node.change_ext(obj_ext)) |
14 | 14 | try: |
17 | 17 | self.compiled_tasks=[rctask] |
18 | 18 | re_lines=re.compile('(?:^[ \t]*(#|%:)[ \t]*(ifdef|ifndef|if|else|elif|endif|include|import|define|undef|pragma)[ \t]*(.*?)\s*$)|''(?:^\w+[ \t]*(ICON|BITMAP|CURSOR|HTML|FONT|MESSAGETABLE|TYPELIB|REGISTRY|D3DFX)[ \t]*(.*?)\s*$)',re.IGNORECASE|re.MULTILINE) |
19 | 19 | class rc_parser(c_preproc.c_parser): |
20 | def filter_comments(self,filepath): | |
21 | code=Utils.readf(filepath) | |
20 | def filter_comments(self,node): | |
21 | code=node.read() | |
22 | 22 | if c_preproc.use_trigraphs: |
23 | 23 | for(a,b)in c_preproc.trig_def:code=code.split(a).join(b) |
24 | 24 | code=c_preproc.re_nl.sub('',code) |
30 | 30 | else: |
31 | 31 | ret.append(('include',m.group(5))) |
32 | 32 | return ret |
33 | def addlines(self,node): | |
34 | self.currentnode_stack.append(node.parent) | |
35 | filepath=node.abspath() | |
36 | self.count_files+=1 | |
37 | if self.count_files>c_preproc.recursion_limit: | |
38 | raise c_preproc.PreprocError("recursion limit exceeded") | |
39 | pc=self.parse_cache | |
40 | Logs.debug('preproc: reading file %r',filepath) | |
41 | try: | |
42 | lns=pc[filepath] | |
43 | except KeyError: | |
44 | pass | |
45 | else: | |
46 | self.lines.extend(lns) | |
47 | return | |
48 | try: | |
49 | lines=self.filter_comments(filepath) | |
50 | lines.append((c_preproc.POPFILE,'')) | |
51 | lines.reverse() | |
52 | pc[filepath]=lines | |
53 | self.lines.extend(lines) | |
54 | except IOError: | |
55 | raise c_preproc.PreprocError("could not read the file %s"%filepath) | |
56 | except Exception: | |
57 | if Logs.verbose>0: | |
58 | Logs.error("parsing %s failed"%filepath) | |
59 | traceback.print_exc() | |
60 | 33 | class winrc(Task.Task): |
61 | 34 | run_str='${WINRC} ${WINRCFLAGS} ${CPPPATH_ST:INCPATHS} ${DEFINES_ST:DEFINES} ${WINRC_TGT_F} ${TGT} ${WINRC_SRC_F} ${SRC}' |
62 | 35 | color='BLUE' |
63 | 36 | def scan(self): |
64 | 37 | tmp=rc_parser(self.generator.includes_nodes) |
65 | 38 | tmp.start(self.inputs[0],self.env) |
66 | nodes=tmp.nodes | |
67 | names=tmp.names | |
68 | if Logs.verbose: | |
69 | Logs.debug('deps: deps for %s: %r; unresolved %r'%(str(self),nodes,names)) | |
70 | return(nodes,names) | |
39 | return(tmp.nodes,tmp.names) | |
71 | 40 | def configure(conf): |
72 | 41 | v=conf.env |
73 | v['WINRC_TGT_F']='-o' | |
74 | v['WINRC_SRC_F']='-i' | |
75 | if not conf.env.WINRC: | |
42 | v.WINRC_TGT_F='-o' | |
43 | v.WINRC_SRC_F='-i' | |
44 | if not v.WINRC: | |
76 | 45 | if v.CC_NAME=='msvc': |
77 | conf.find_program('RC',var='WINRC',path_list=v['PATH']) | |
78 | v['WINRC_TGT_F']='/fo' | |
79 | v['WINRC_SRC_F']='' | |
46 | conf.find_program('RC',var='WINRC',path_list=v.PATH) | |
47 | v.WINRC_TGT_F='/fo' | |
48 | v.WINRC_SRC_F='' | |
80 | 49 | else: |
81 | conf.find_program('windres',var='WINRC',path_list=v['PATH']) | |
82 | if not conf.env.WINRC: | |
50 | conf.find_program('windres',var='WINRC',path_list=v.PATH) | |
51 | if not v.WINRC: | |
83 | 52 | conf.fatal('winrc was not found!') |
84 | v['WINRCFLAGS']=[] |
11 | 11 | @conf |
12 | 12 | def xlc_common_flags(conf): |
13 | 13 | v=conf.env |
14 | v['CC_SRC_F']=[] | |
15 | v['CC_TGT_F']=['-c','-o'] | |
16 | if not v['LINK_CC']:v['LINK_CC']=v['CC'] | |
17 | v['CCLNK_SRC_F']=[] | |
18 | v['CCLNK_TGT_F']=['-o'] | |
19 | v['CPPPATH_ST']='-I%s' | |
20 | v['DEFINES_ST']='-D%s' | |
21 | v['LIB_ST']='-l%s' | |
22 | v['LIBPATH_ST']='-L%s' | |
23 | v['STLIB_ST']='-l%s' | |
24 | v['STLIBPATH_ST']='-L%s' | |
25 | v['RPATH_ST']='-Wl,-rpath,%s' | |
26 | v['SONAME_ST']=[] | |
27 | v['SHLIB_MARKER']=[] | |
28 | v['STLIB_MARKER']=[] | |
29 | v['LINKFLAGS_cprogram']=['-Wl,-brtl'] | |
30 | v['cprogram_PATTERN']='%s' | |
31 | v['CFLAGS_cshlib']=['-fPIC'] | |
32 | v['LINKFLAGS_cshlib']=['-G','-Wl,-brtl,-bexpfull'] | |
33 | v['cshlib_PATTERN']='lib%s.so' | |
34 | v['LINKFLAGS_cstlib']=[] | |
35 | v['cstlib_PATTERN']='lib%s.a' | |
14 | v.CC_SRC_F=[] | |
15 | v.CC_TGT_F=['-c','-o'] | |
16 | if not v.LINK_CC: | |
17 | v.LINK_CC=v.CC | |
18 | v.CCLNK_SRC_F=[] | |
19 | v.CCLNK_TGT_F=['-o'] | |
20 | v.CPPPATH_ST='-I%s' | |
21 | v.DEFINES_ST='-D%s' | |
22 | v.LIB_ST='-l%s' | |
23 | v.LIBPATH_ST='-L%s' | |
24 | v.STLIB_ST='-l%s' | |
25 | v.STLIBPATH_ST='-L%s' | |
26 | v.RPATH_ST='-Wl,-rpath,%s' | |
27 | v.SONAME_ST=[] | |
28 | v.SHLIB_MARKER=[] | |
29 | v.STLIB_MARKER=[] | |
30 | v.LINKFLAGS_cprogram=['-Wl,-brtl'] | |
31 | v.cprogram_PATTERN='%s' | |
32 | v.CFLAGS_cshlib=['-fPIC'] | |
33 | v.LINKFLAGS_cshlib=['-G','-Wl,-brtl,-bexpfull'] | |
34 | v.cshlib_PATTERN='lib%s.so' | |
35 | v.LINKFLAGS_cstlib=[] | |
36 | v.cstlib_PATTERN='lib%s.a' | |
36 | 37 | def configure(conf): |
37 | 38 | conf.find_xlc() |
38 | 39 | conf.find_ar() |
11 | 11 | @conf |
12 | 12 | def xlcxx_common_flags(conf): |
13 | 13 | v=conf.env |
14 | v['CXX_SRC_F']=[] | |
15 | v['CXX_TGT_F']=['-c','-o'] | |
16 | if not v['LINK_CXX']:v['LINK_CXX']=v['CXX'] | |
17 | v['CXXLNK_SRC_F']=[] | |
18 | v['CXXLNK_TGT_F']=['-o'] | |
19 | v['CPPPATH_ST']='-I%s' | |
20 | v['DEFINES_ST']='-D%s' | |
21 | v['LIB_ST']='-l%s' | |
22 | v['LIBPATH_ST']='-L%s' | |
23 | v['STLIB_ST']='-l%s' | |
24 | v['STLIBPATH_ST']='-L%s' | |
25 | v['RPATH_ST']='-Wl,-rpath,%s' | |
26 | v['SONAME_ST']=[] | |
27 | v['SHLIB_MARKER']=[] | |
28 | v['STLIB_MARKER']=[] | |
29 | v['LINKFLAGS_cxxprogram']=['-Wl,-brtl'] | |
30 | v['cxxprogram_PATTERN']='%s' | |
31 | v['CXXFLAGS_cxxshlib']=['-fPIC'] | |
32 | v['LINKFLAGS_cxxshlib']=['-G','-Wl,-brtl,-bexpfull'] | |
33 | v['cxxshlib_PATTERN']='lib%s.so' | |
34 | v['LINKFLAGS_cxxstlib']=[] | |
35 | v['cxxstlib_PATTERN']='lib%s.a' | |
14 | v.CXX_SRC_F=[] | |
15 | v.CXX_TGT_F=['-c','-o'] | |
16 | if not v.LINK_CXX: | |
17 | v.LINK_CXX=v.CXX | |
18 | v.CXXLNK_SRC_F=[] | |
19 | v.CXXLNK_TGT_F=['-o'] | |
20 | v.CPPPATH_ST='-I%s' | |
21 | v.DEFINES_ST='-D%s' | |
22 | v.LIB_ST='-l%s' | |
23 | v.LIBPATH_ST='-L%s' | |
24 | v.STLIB_ST='-l%s' | |
25 | v.STLIBPATH_ST='-L%s' | |
26 | v.RPATH_ST='-Wl,-rpath,%s' | |
27 | v.SONAME_ST=[] | |
28 | v.SHLIB_MARKER=[] | |
29 | v.STLIB_MARKER=[] | |
30 | v.LINKFLAGS_cxxprogram=['-Wl,-brtl'] | |
31 | v.cxxprogram_PATTERN='%s' | |
32 | v.CXXFLAGS_cxxshlib=['-fPIC'] | |
33 | v.LINKFLAGS_cxxshlib=['-G','-Wl,-brtl,-bexpfull'] | |
34 | v.cxxshlib_PATTERN='lib%s.so' | |
35 | v.LINKFLAGS_cxxstlib=[] | |
36 | v.cxxstlib_PATTERN='lib%s.a' | |
36 | 37 | def configure(conf): |
37 | 38 | conf.find_xlcxx() |
38 | 39 | conf.find_ar() |
1 | 1 | # encoding: utf-8 |
2 | 2 | # WARNING! Do not edit! https://waf.io/book/index.html#_obtaining_the_waf_file |
3 | 3 | |
4 | import os,sys,errno,traceback,inspect,re,shutil,datetime,gc,platform | |
5 | import subprocess | |
4 | import os,sys,errno,traceback,inspect,re,datetime,platform,base64,signal,functools | |
5 | try: | |
6 | import cPickle | |
7 | except ImportError: | |
8 | import pickle as cPickle | |
9 | if os.name=='posix'and sys.version_info[0]<3: | |
10 | try: | |
11 | import subprocess32 as subprocess | |
12 | except ImportError: | |
13 | import subprocess | |
14 | else: | |
15 | import subprocess | |
16 | try: | |
17 | TimeoutExpired=subprocess.TimeoutExpired | |
18 | except AttributeError: | |
19 | class TimeoutExpired(object): | |
20 | pass | |
6 | 21 | from collections import deque,defaultdict |
7 | 22 | try: |
8 | 23 | import _winreg as winreg |
12 | 27 | except ImportError: |
13 | 28 | winreg=None |
14 | 29 | from waflib import Errors |
15 | try: | |
16 | from collections import UserDict | |
17 | except ImportError: | |
18 | from UserDict import UserDict | |
19 | 30 | try: |
20 | 31 | from hashlib import md5 |
21 | 32 | except ImportError: |
36 | 47 | def release(self): |
37 | 48 | pass |
38 | 49 | threading.Lock=threading.Thread=Lock |
39 | else: | |
40 | run_old=threading.Thread.run | |
41 | def run(*args,**kwargs): | |
42 | try: | |
43 | run_old(*args,**kwargs) | |
44 | except(KeyboardInterrupt,SystemExit): | |
45 | raise | |
46 | except Exception: | |
47 | sys.excepthook(*sys.exc_info()) | |
48 | threading.Thread.run=run | |
49 | SIG_NIL='iluvcuteoverload'.encode() | |
50 | SIG_NIL='SIG_NIL_SIG_NIL_'.encode() | |
50 | 51 | O644=420 |
51 | 52 | O755=493 |
52 | 53 | rot_chr=['\\','|','/','-'] |
53 | 54 | rot_idx=0 |
54 | try: | |
55 | from collections import OrderedDict as ordered_iter_dict | |
56 | except ImportError: | |
57 | class ordered_iter_dict(dict): | |
58 | def __init__(self,*k,**kw): | |
59 | self.lst=[] | |
60 | dict.__init__(self,*k,**kw) | |
61 | def clear(self): | |
62 | dict.clear(self) | |
63 | self.lst=[] | |
64 | def __setitem__(self,key,value): | |
65 | dict.__setitem__(self,key,value) | |
66 | try: | |
67 | self.lst.remove(key) | |
68 | except ValueError: | |
69 | pass | |
70 | self.lst.append(key) | |
71 | def __delitem__(self,key): | |
72 | dict.__delitem__(self,key) | |
73 | try: | |
74 | self.lst.remove(key) | |
75 | except ValueError: | |
76 | pass | |
77 | def __iter__(self): | |
78 | for x in self.lst: | |
79 | yield x | |
80 | def keys(self): | |
81 | return self.lst | |
55 | class ordered_iter_dict(dict): | |
56 | def __init__(self,*k,**kw): | |
57 | self.lst=deque() | |
58 | dict.__init__(self,*k,**kw) | |
59 | def clear(self): | |
60 | dict.clear(self) | |
61 | self.lst=deque() | |
62 | def __setitem__(self,key,value): | |
63 | if key in dict.keys(self): | |
64 | self.lst.remove(key) | |
65 | dict.__setitem__(self,key,value) | |
66 | self.lst.append(key) | |
67 | def __delitem__(self,key): | |
68 | dict.__delitem__(self,key) | |
69 | try: | |
70 | self.lst.remove(key) | |
71 | except ValueError: | |
72 | pass | |
73 | def __iter__(self): | |
74 | return reversed(self.lst) | |
75 | def keys(self): | |
76 | return reversed(self.lst) | |
77 | class lru_node(object): | |
78 | __slots__=('next','prev','key','val') | |
79 | def __init__(self): | |
80 | self.next=self | |
81 | self.prev=self | |
82 | self.key=None | |
83 | self.val=None | |
84 | class lru_cache(object): | |
85 | __slots__=('maxlen','table','head') | |
86 | def __init__(self,maxlen=100): | |
87 | self.maxlen=maxlen | |
88 | self.table={} | |
89 | self.head=lru_node() | |
90 | self.head.next=self.head | |
91 | self.head.prev=self.head | |
92 | def __getitem__(self,key): | |
93 | node=self.table[key] | |
94 | if node is self.head: | |
95 | return node.val | |
96 | node.prev.next=node.next | |
97 | node.next.prev=node.prev | |
98 | node.next=self.head.next | |
99 | node.prev=self.head | |
100 | self.head=node.next.prev=node.prev.next=node | |
101 | return node.val | |
102 | def __setitem__(self,key,val): | |
103 | if key in self.table: | |
104 | node=self.table[key] | |
105 | node.val=val | |
106 | self.__getitem__(key) | |
107 | else: | |
108 | if len(self.table)<self.maxlen: | |
109 | node=lru_node() | |
110 | node.prev=self.head | |
111 | node.next=self.head.next | |
112 | node.prev.next=node.next.prev=node | |
113 | else: | |
114 | node=self.head=self.head.next | |
115 | try: | |
116 | del self.table[node.key] | |
117 | except KeyError: | |
118 | pass | |
119 | node.key=key | |
120 | node.val=val | |
121 | self.table[key]=node | |
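The `lru_cache` above threads entries on a circular doubly-linked list rooted at `head`, so both lookup promotion and eviction are O(1). The same least-recently-used policy can be sketched with the standard library's `OrderedDict` (names here are illustrative):

```python
from collections import OrderedDict

class simple_lru(object):
    # Same eviction policy as lru_cache above, using OrderedDict
    # instead of a hand-rolled circular linked list.
    def __init__(self, maxlen=100):
        self.maxlen = maxlen
        self.data = OrderedDict()
    def __getitem__(self, key):
        self.data.move_to_end(key)         # mark as most recently used
        return self.data[key]
    def __setitem__(self, key, val):
        if key in self.data:
            self.data.move_to_end(key)
        elif len(self.data) >= self.maxlen:
            self.data.popitem(last=False)  # evict the least recently used
        self.data[key] = val

cache = simple_lru(maxlen=2)
cache['a'] = 1
cache['b'] = 2
_ = cache['a']     # touching 'a' makes 'b' the eviction candidate
cache['c'] = 3     # evicts 'b'
```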
82 | 122 | is_win32=os.sep=='\\'or sys.platform=='win32' |
83 | 123 | def readf(fname,m='r',encoding='ISO8859-1'): |
84 | 124 | if sys.hexversion>0x3000000 and not'b'in m: |
158 | 198 | try: |
159 | 199 | fd=os.open(f,flags) |
160 | 200 | except OSError: |
161 | raise IOError('Cannot write to %r'%f) | |
201 | raise OSError('Cannot write to %r'%f) | |
162 | 202 | f=os.fdopen(fd,m) |
163 | 203 | try: |
164 | 204 | f.write(data) |
168 | 208 | try: |
169 | 209 | fd=os.open(fname,os.O_BINARY|os.O_RDONLY|os.O_NOINHERIT) |
170 | 210 | except OSError: |
171 | raise IOError('Cannot read from %r'%fname) | |
211 | raise OSError('Cannot read from %r'%fname) | |
172 | 212 | f=os.fdopen(fd,'rb') |
173 | 213 | m=md5() |
174 | 214 | try: |
208 | 248 | try: |
209 | 249 | import ctypes |
210 | 250 | except ImportError: |
211 | return[x+':\\'for x in list('ABCDEFGHIJKLMNOPQRSTUVWXYZ')] | |
251 | return[x+':\\'for x in'ABCDEFGHIJKLMNOPQRSTUVWXYZ'] | |
212 | 252 | else: |
213 | 253 | dlen=4 |
214 | 254 | maxdrives=26 |
236 | 276 | return ret |
237 | 277 | return ver |
238 | 278 | def ex_stack(): |
239 | exc_type,exc_value,tb=sys.exc_info() | |
240 | exc_lines=traceback.format_exception(exc_type,exc_value,tb) | |
241 | return''.join(exc_lines) | |
242 | def to_list(sth): | |
243 | if isinstance(sth,str): | |
244 | return sth.split() | |
245 | else: | |
246 | return sth | |
279 | return traceback.format_exc() | |
280 | def to_list(val): | |
281 | if isinstance(val,str): | |
282 | return val.split() | |
283 | else: | |
284 | return val | |
247 | 285 | def split_path_unix(path): |
248 | 286 | return path.split('/') |
249 | 287 | def split_path_cygwin(path): |
252 | 290 | ret[0]='/'+ret[0] |
253 | 291 | return ret |
254 | 292 | return path.split('/') |
255 | re_sp=re.compile('[/\\\\]') | |
293 | re_sp=re.compile('[/\\\\]+') | |
256 | 294 | def split_path_win32(path): |
257 | 295 | if path.startswith('\\\\'): |
258 | ret=re.split(re_sp,path)[2:] | |
296 | ret=re_sp.split(path)[2:] | |
259 | 297 | ret[0]='\\'+ret[0] |
260 | 298 | return ret |
261 | return re.split(re_sp,path) | |
299 | return re_sp.split(path) | |
262 | 300 | msysroot=None |
263 | 301 | def split_path_msys(path): |
264 | if(path.startswith('/')or path.startswith('\\'))and not path.startswith('//')and not path.startswith('\\\\'): | |
302 | if path.startswith(('/','\\'))and not path.startswith(('//','\\\\')): | |
265 | 303 | global msysroot |
266 | 304 | if not msysroot: |
267 | 305 | msysroot=subprocess.check_output(['cygpath','-w','/']).decode(sys.stdout.encoding or'iso8859-1') |
271 | 309 | if sys.platform=='cygwin': |
272 | 310 | split_path=split_path_cygwin |
273 | 311 | elif is_win32: |
274 | if os.environ.get('MSYSTEM',None): | |
312 | if os.environ.get('MSYSTEM'): | |
275 | 313 | split_path=split_path_msys |
276 | 314 | else: |
277 | 315 | split_path=split_path_win32 |
278 | 316 | else: |
279 | 317 | split_path=split_path_unix |
280 | 318 | split_path.__doc__=""" |
281 | Split a path by / or \\. This function is not like os.path.split | |
319 | Splits a path by / or \\; do not confuse this function with ``os.path.split`` | |
282 | 320 | |
283 | 321 | :type path: string |
284 | 322 | :param path: path to split |
285 | :return: list of strings | |
323 | :return: list of strings | |
286 | 324 | """ |
287 | 325 | def check_dir(path): |
288 | 326 | if not os.path.isdir(path): |
301 | 339 | return os.path.abspath(name) |
302 | 340 | else: |
303 | 341 | env=env or os.environ |
304 | for path in env["PATH"].split(os.pathsep): | |
342 | for path in env['PATH'].split(os.pathsep): | |
305 | 343 | path=path.strip('"') |
306 | 344 | exe_file=os.path.join(path,name) |
307 | 345 | if is_exe(exe_file): |
317 | 355 | fu=fu.upper() |
318 | 356 | return fu |
319 | 357 | def h_list(lst): |
320 | m=md5() | |
321 | m.update(str(lst).encode()) | |
322 | return m.digest() | |
358 | return md5(repr(lst).encode()).digest() | |
323 | 359 | def h_fun(fun): |
324 | 360 | try: |
325 | 361 | return fun.code |
326 | 362 | except AttributeError: |
363 | if isinstance(fun,functools.partial): | |
364 | code=list(fun.args) | |
365 | code.extend(sorted(fun.keywords.items())) | |
366 | code.append(h_fun(fun.func)) | |
367 | fun.code=h_list(code) | |
368 | return fun.code | |
327 | 369 | try: |
328 | 370 | h=inspect.getsource(fun) |
329 | except IOError: | |
330 | h="nocode" | |
371 | except EnvironmentError: | |
372 | h='nocode' | |
331 | 373 | try: |
332 | 374 | fun.code=h |
333 | 375 | except AttributeError: |
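The `h_fun` change above teaches waf's function hasher about `functools.partial`: a partial is hashed from its bound arguments, its sorted keyword items, and the hash of the wrapped callable. A simplified standalone sketch (function name and hex digest output are illustrative choices, not waf's API):

```python
import functools
import hashlib
import inspect

def hash_callable(fun):
    # Simplified sketch of the h_fun logic above: recurse into
    # functools.partial objects; hash plain functions by their source,
    # falling back to 'nocode' when no source is available (builtins).
    if isinstance(fun, functools.partial):
        code = list(fun.args)
        code.extend(sorted(fun.keywords.items()))
        code.append(hash_callable(fun.func))
        return hashlib.md5(repr(code).encode()).hexdigest()
    try:
        src = inspect.getsource(fun)
    except (OSError, TypeError):
        src = 'nocode'
    return hashlib.md5(src.encode()).hexdigest()

h = hash_callable(functools.partial(max, 10, key=abs))
```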
403 | 445 | if days or hours or minutes: |
404 | 446 | result+='%dm'%minutes |
405 | 447 | return'%s%.3fs'%(result,seconds) |
406 | if is_win32: | |
407 | old=shutil.copy2 | |
408 | def copy2(src,dst): | |
409 | old(src,dst) | |
410 | shutil.copystat(src,dst) | |
411 | setattr(shutil,'copy2',copy2) | |
412 | if os.name=='java': | |
413 | try: | |
414 | gc.disable() | |
415 | gc.enable() | |
416 | except NotImplementedError: | |
417 | gc.disable=gc.enable | |
418 | 448 | def read_la_file(path): |
419 | 449 | sp=re.compile(r'^([^=]+)=\'(.*)\'$') |
420 | 450 | dc={} |
425 | 455 | except ValueError: |
426 | 456 | pass |
427 | 457 | return dc |
428 | def nogc(fun): | |
429 | def f(*k,**kw): | |
430 | try: | |
431 | gc.disable() | |
432 | ret=fun(*k,**kw) | |
433 | finally: | |
434 | gc.enable() | |
435 | return ret | |
436 | f.__doc__=fun.__doc__ | |
437 | return f | |
438 | 458 | def run_once(fun): |
439 | 459 | cache={} |
440 | def wrap(k): | |
460 | def wrap(*k): | |
441 | 461 | try: |
442 | 462 | return cache[k] |
443 | 463 | except KeyError: |
444 | ret=fun(k) | |
464 | ret=fun(*k) | |
445 | 465 | cache[k]=ret |
446 | 466 | return ret |
447 | 467 | wrap.__cache__=cache |
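The diff above changes `wrap(k)` to `wrap(*k)`, so `run_once` memoizes on the tuple of positional arguments and multi-argument functions can be cached. A self-contained sketch of the pattern:

```python
def run_once(fun):
    # Memoization keyed on the positional-argument tuple, as in the
    # patched run_once above.
    cache = {}
    def wrap(*k):
        try:
            return cache[k]
        except KeyError:
            ret = fun(*k)
            cache[k] = ret
            return ret
    wrap.__cache__ = cache
    return wrap

calls = []

@run_once
def add(a, b):
    calls.append((a, b))
    return a + b

first = add(1, 2)
second = add(1, 2)   # served from the cache; the body is not re-run
```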
465 | 485 | return'' |
466 | 486 | def sane_path(p): |
467 | 487 | return os.path.abspath(os.path.expanduser(p)) |
488 | process_pool=[] | |
489 | def get_process(): | |
490 | try: | |
491 | return process_pool.pop() | |
492 | except IndexError: | |
493 | filepath=os.path.dirname(os.path.abspath(__file__))+os.sep+'processor.py' | |
494 | cmd=[sys.executable,'-c',readf(filepath)] | |
495 | return subprocess.Popen(cmd,stdout=subprocess.PIPE,stdin=subprocess.PIPE,bufsize=0) | |
496 | def run_prefork_process(cmd,kwargs,cargs): | |
497 | if not'env'in kwargs: | |
498 | kwargs['env']=dict(os.environ) | |
499 | try: | |
500 | obj=base64.b64encode(cPickle.dumps([cmd,kwargs,cargs])) | |
501 | except TypeError: | |
502 | return run_regular_process(cmd,kwargs,cargs) | |
503 | proc=get_process() | |
504 | if not proc: | |
505 | return run_regular_process(cmd,kwargs,cargs) | |
506 | proc.stdin.write(obj) | |
507 | proc.stdin.write('\n'.encode()) | |
508 | proc.stdin.flush() | |
509 | obj=proc.stdout.readline() | |
510 | if not obj: | |
511 | raise OSError('Preforked sub-process %r died'%proc.pid) | |
512 | process_pool.append(proc) | |
513 | ret,out,err,ex,trace=cPickle.loads(base64.b64decode(obj)) | |
514 | if ex: | |
515 | if ex=='OSError': | |
516 | raise OSError(trace) | |
517 | elif ex=='ValueError': | |
518 | raise ValueError(trace) | |
519 | elif ex=='TimeoutExpired': | |
520 | exc=TimeoutExpired(cmd,timeout=cargs['timeout'],output=out) | |
521 | exc.stderr=err | |
522 | raise exc | |
523 | else: | |
524 | raise Exception(trace) | |
525 | return ret,out,err | |
526 | def lchown(path,user=-1,group=-1): | |
527 | if isinstance(user,str): | |
528 | import pwd | |
529 | entry=pwd.getpwnam(user) | |
530 | if not entry: | |
531 | raise OSError('Unknown user %r'%user) | |
532 | user=entry[2] | |
533 | if isinstance(group,str): | |
534 | import grp | |
535 | entry=grp.getgrnam(group) | |
536 | if not entry: | |
537 | raise OSError('Unknown group %r'%group) | |
538 | group=entry[2] | |
539 | return os.lchown(path,user,group) | |
540 | def run_regular_process(cmd,kwargs,cargs={}): | |
541 | proc=subprocess.Popen(cmd,**kwargs) | |
542 | if kwargs.get('stdout')or kwargs.get('stderr'): | |
543 | try: | |
544 | out,err=proc.communicate(**cargs) | |
545 | except TimeoutExpired: | |
546 | if kwargs.get('start_new_session')and hasattr(os,'killpg'): | |
547 | os.killpg(proc.pid,signal.SIGKILL) | |
548 | else: | |
549 | proc.kill() | |
550 | out,err=proc.communicate() | |
551 | exc=TimeoutExpired(proc.args,timeout=cargs['timeout'],output=out) | |
552 | exc.stderr=err | |
553 | raise exc | |
554 | status=proc.returncode | |
555 | else: | |
556 | out,err=(None,None) | |
557 | try: | |
558 | status=proc.wait(**cargs) | |
559 | except TimeoutExpired as e: | |
560 | if kwargs.get('start_new_session')and hasattr(os,'killpg'): | |
561 | os.killpg(proc.pid,signal.SIGKILL) | |
562 | else: | |
563 | proc.kill() | |
564 | proc.wait() | |
565 | raise e | |
566 | return status,out,err | |
567 | def run_process(cmd,kwargs,cargs={}): | |
568 | if kwargs.get('stdout')and kwargs.get('stderr'): | |
569 | return run_prefork_process(cmd,kwargs,cargs) | |
570 | else: | |
571 | return run_regular_process(cmd,kwargs,cargs) | |
572 | def alloc_process_pool(n,force=False): | |
573 | global run_process,get_process,alloc_process_pool | |
574 | if not force: | |
575 | n=max(n-len(process_pool),0) | |
576 | try: | |
577 | lst=[get_process()for x in range(n)] | |
578 | except OSError: | |
579 | run_process=run_regular_process | |
580 | get_process=alloc_process_pool=nada | |
581 | else: | |
582 | for x in lst: | |
583 | process_pool.append(x) | |
584 | if sys.platform=='cli'or not sys.executable: | |
585 | run_process=run_regular_process | |
586 | get_process=alloc_process_pool=nada |
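The timeout handling added above starts children with `start_new_session` so that, when `TimeoutExpired` fires, the whole process group can be killed with `os.killpg` rather than just the immediate child. A minimal POSIX-only sketch of that technique:

```python
import os
import signal
import subprocess
import sys

# In a new session the child's pid is also its process group id, so
# killpg reaps the child and any grandchildren it spawned.
proc = subprocess.Popen(
    [sys.executable, '-c', 'import time; time.sleep(60)'],
    start_new_session=True)
try:
    proc.wait(timeout=0.5)
except subprocess.TimeoutExpired:
    os.killpg(proc.pid, signal.SIGKILL)
    proc.wait()
```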
81 | 81 | self.all_envs[name]=env |
82 | 82 | else: |
83 | 83 | if fromenv: |
84 | Logs.warn("The environment %s may have been configured already"%name) | |
84 | Logs.warn('The environment %s may have been configured already',name) | |
85 | 85 | return env |
86 | 86 | Configure.ConfigurationContext.retrieve=retrieve |
87 | 87 | Configure.ConfigurationContext.sub_config=Configure.ConfigurationContext.recurse |
134 | 134 | ret=rev(path,encoding) |
135 | 135 | if'set_options'in ret.__dict__: |
136 | 136 | if Logs.verbose: |
137 | Logs.warn('compat: rename "set_options" to "options" (%r)'%path) | |
137 | Logs.warn('compat: rename "set_options" to "options" (%r)',path) | |
138 | 138 | ret.options=ret.set_options |
139 | 139 | if'srcdir'in ret.__dict__: |
140 | 140 | if Logs.verbose: |
141 | Logs.warn('compat: rename "srcdir" to "top" (%r)'%path) | |
141 | Logs.warn('compat: rename "srcdir" to "top" (%r)',path) | |
142 | 142 | ret.top=ret.srcdir |
143 | 143 | if'blddir'in ret.__dict__: |
144 | 144 | if Logs.verbose: |
145 | Logs.warn('compat: rename "blddir" to "out" (%r)'%path) | |
145 | Logs.warn('compat: rename "blddir" to "out" (%r)',path) | |
146 | 146 | ret.out=ret.blddir |
147 | 147 | Utils.g_module=Context.g_module |
148 | 148 | Options.launch_dir=Context.launch_dir |
181 | 181 | self.includes=self.to_list(getattr(self,'includes',[])) |
182 | 182 | names=self.to_list(getattr(self,'uselib_local',[])) |
183 | 183 | get=self.bld.get_tgen_by_name |
184 | seen=set([]) | |
185 | seen_uselib=set([]) | |
184 | seen=set() | |
185 | seen_uselib=set() | |
186 | 186 | tmp=Utils.deque(names) |
187 | 187 | if tmp: |
188 | 188 | if Logs.verbose: |
281 | 281 | return[] |
282 | 282 | destpath=Utils.subst_vars(path,self.env) |
283 | 283 | if self.is_install>0: |
284 | Logs.info('* creating %s'%destpath) | |
284 | Logs.info('* creating %s',destpath) | |
285 | 285 | Utils.check_dir(destpath) |
286 | 286 | elif self.is_install<0: |
287 | Logs.info('* removing %s'%destpath) | |
287 | Logs.info('* removing %s',destpath) | |
288 | 288 | try: |
289 | 289 | os.remove(destpath) |
290 | 290 | except OSError: |
43 | 43 | @subst('*') |
44 | 44 | def r1(code): |
45 | 45 | code=code.replace('as e:',',e:') |
46 | code=code.replace(".decode(sys.stdout.encoding or 'iso8859-1')",'') | |
47 | code=code.replace('.encode()','') | |
48 | return code | |
46 | code=code.replace(".decode(sys.stdout.encoding or'iso8859-1',errors='replace')",'') | |
47 | return code.replace('.encode()','') | |
49 | 48 | @subst('Runner.py') |
50 | 49 | def r4(code): |
51 | code=code.replace('next(self.biter)','self.biter.next()') | |
52 | return code | |
50 | return code.replace('next(self.biter)','self.biter.next()') | |
51 | @subst('Context.py') | |
52 | def r5(code): | |
53 | return code.replace("('Execution failure: %s'%str(e),ex=e)","('Execution failure: %s'%str(e),ex=e),None,sys.exc_info()[2]") |
0 | #! /usr/bin/env python | |
1 | # encoding: utf-8 | |
2 | # WARNING! Do not edit! https://waf.io/book/index.html#_obtaining_the_waf_file | |
3 | ||
4 | import os,sys,traceback,base64,signal | |
5 | try: | |
6 | import cPickle | |
7 | except ImportError: | |
8 | import pickle as cPickle | |
9 | try: | |
10 | import subprocess32 as subprocess | |
11 | except ImportError: | |
12 | import subprocess | |
13 | try: | |
14 | TimeoutExpired=subprocess.TimeoutExpired | |
15 | except AttributeError: | |
16 | class TimeoutExpired(object): | |
17 | pass | |
18 | def run(): | |
19 | txt=sys.stdin.readline().strip() | |
20 | if not txt: | |
21 | sys.exit(1) | |
22 | [cmd,kwargs,cargs]=cPickle.loads(base64.b64decode(txt)) | |
23 | cargs=cargs or{} | |
24 | ret=1 | |
25 | out,err,ex,trace=(None,None,None,None) | |
26 | try: | |
27 | proc=subprocess.Popen(cmd,**kwargs) | |
28 | try: | |
29 | out,err=proc.communicate(**cargs) | |
30 | except TimeoutExpired: | |
31 | if kwargs.get('start_new_session')and hasattr(os,'killpg'): | |
32 | os.killpg(proc.pid,signal.SIGKILL) | |
33 | else: | |
34 | proc.kill() | |
35 | out,err=proc.communicate() | |
36 | exc=TimeoutExpired(proc.args,timeout=cargs['timeout'],output=out) | |
37 | exc.stderr=err | |
38 | raise exc | |
39 | ret=proc.returncode | |
40 | except Exception as e: | |
41 | exc_type,exc_value,tb=sys.exc_info() | |
42 | exc_lines=traceback.format_exception(exc_type,exc_value,tb) | |
43 | trace=str(cmd)+'\n'+''.join(exc_lines) | |
44 | ex=e.__class__.__name__ | |
45 | tmp=[ret,out,err,ex,trace] | |
46 | obj=base64.b64encode(cPickle.dumps(tmp)) | |
47 | sys.stdout.write(obj.decode()) | |
48 | sys.stdout.write('\n') | |
49 | sys.stdout.flush() | |
50 | while 1: | |
51 | try: | |
52 | run() | |
53 | except KeyboardInterrupt: | |
54 | break |
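`processor.py` above speaks a one-request-per-line protocol: each request and each reply is a base64-encoded pickle terminated by a newline, which keeps the framing trivial over stdin/stdout pipes. A toy round-trip using the same framing (the worker body, which just sums a list, is illustrative and not the real `processor.py`):

```python
import base64
import pickle
import subprocess
import sys

# Toy worker: one base64-encoded pickle per line in, one per line out.
worker_src = r'''
import sys, base64, pickle
for line in sys.stdin:
    numbers = pickle.loads(base64.b64decode(line.strip()))
    reply = base64.b64encode(pickle.dumps(sum(numbers)))
    sys.stdout.write(reply.decode() + "\n")
    sys.stdout.flush()
'''
proc = subprocess.Popen([sys.executable, '-c', worker_src],
                        stdin=subprocess.PIPE, stdout=subprocess.PIPE)
proc.stdin.write(base64.b64encode(pickle.dumps([1, 2, 3])) + b'\n')
proc.stdin.flush()
result = pickle.loads(base64.b64decode(proc.stdout.readline()))
proc.stdin.close()
proc.wait()
```

Base64 output never contains a newline, so a bare `\n` is a safe record separator, and the parent can keep the worker alive for further requests exactly as the prefork pool does.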
13 | 13 | |
14 | 14 | APPNAME = 'aubio' |
15 | 15 | |
16 | # source VERSION | |
17 | for l in open('VERSION').readlines(): exec (l.strip()) | |
18 | ||
19 | VERSION = '.'.join ([str(x) for x in [ | |
20 | AUBIO_MAJOR_VERSION, | |
21 | AUBIO_MINOR_VERSION, | |
22 | AUBIO_PATCH_VERSION | |
23 | ]]) + AUBIO_VERSION_STATUS | |
24 | ||
25 | LIB_VERSION = '.'.join ([str(x) for x in [ | |
26 | LIBAUBIO_LT_CUR, | |
27 | LIBAUBIO_LT_REV, | |
28 | LIBAUBIO_LT_AGE]]) | |
16 | from this_version import * | |
17 | ||
18 | VERSION = get_aubio_version() | |
19 | LIB_VERSION = get_libaubio_version() | |
29 | 20 | |
30 | 21 | top = '.' |
31 | 22 | out = 'build' |
258 | 249 | if (ctx.options.enable_fftw3 != False or ctx.options.enable_fftw3f != False): |
259 | 250 | # one of fftwf or fftw3f |
260 | 251 | if (ctx.options.enable_fftw3f != False): |
261 | ctx.check_cfg(package = 'fftw3f', atleast_version = '3.0.0', | |
262 | args = '--cflags --libs', | |
252 | ctx.check_cfg(package = 'fftw3f', | |
253 | args = '--cflags --libs fftw3f >= 3.0.0', | |
263 | 254 | mandatory = ctx.options.enable_fftw3f) |
264 | 255 | if (ctx.options.enable_double == True): |
265 | 256 | ctx.msg('Warning', |
268 | 259 | # fftw3f disabled, take most sensible one according to |
269 | 260 | # enable_double |
270 | 261 | if (ctx.options.enable_double == True): |
271 | ctx.check_cfg(package = 'fftw3', atleast_version = '3.0.0', | |
272 | args = '--cflags --libs', mandatory = | |
273 | ctx.options.enable_fftw3) | |
262 | ctx.check_cfg(package = 'fftw3', | |
263 | args = '--cflags --libs fftw3 >= 3.0.0', | |
264 | mandatory = ctx.options.enable_fftw3) | |
274 | 265 | else: |
275 | ctx.check_cfg(package = 'fftw3f', atleast_version = '3.0.0', | |
276 | args = '--cflags --libs', | |
266 | ctx.check_cfg(package = 'fftw3f', | |
267 | args = '--cflags --libs fftw3f >= 3.0.0', | |
277 | 268 | mandatory = ctx.options.enable_fftw3) |
278 | 269 | ctx.define('HAVE_FFTW3', 1) |
279 | 270 | |
289 | 280 | |
290 | 281 | # check for libsndfile |
291 | 282 | if (ctx.options.enable_sndfile != False): |
292 | ctx.check_cfg(package = 'sndfile', atleast_version = '1.0.4', | |
293 | args = '--cflags --libs', | |
283 | ctx.check_cfg(package = 'sndfile', | |
284 | args = '--cflags --libs sndfile >= 1.0.4', | |
294 | 285 | mandatory = ctx.options.enable_sndfile) |
295 | 286 | |
296 | 287 | # check for libsamplerate |
302 | 293 | ctx.msg('Checking if using samplerate', 'no (disabled in double precision mode)', |
303 | 294 | color = 'YELLOW') |
304 | 295 | if (ctx.options.enable_samplerate != False): |
305 | ctx.check_cfg(package = 'samplerate', atleast_version = '0.0.15', | |
306 | args = '--cflags --libs', | |
296 | ctx.check_cfg(package = 'samplerate', | |
297 | args = '--cflags --libs samplerate >= 0.0.15', | |
307 | 298 | mandatory = ctx.options.enable_samplerate) |
308 | 299 | |
309 | 300 | # check for jack |
314 | 305 | |
315 | 306 | # check for libav |
316 | 307 | if (ctx.options.enable_avcodec != False): |
317 | ctx.check_cfg(package = 'libavcodec', atleast_version = '54.35.0', | |
318 | args = '--cflags --libs', uselib_store = 'AVCODEC', | |
308 | ctx.check_cfg(package = 'libavcodec', | |
309 | args = '--cflags --libs libavcodec >= 54.35.0', | |
310 | uselib_store = 'AVCODEC', | |
319 | 311 | mandatory = ctx.options.enable_avcodec) |
320 | ctx.check_cfg(package = 'libavformat', atleast_version = '52.3.0', | |
321 | args = '--cflags --libs', uselib_store = 'AVFORMAT', | |
312 | ctx.check_cfg(package = 'libavformat', | |
313 | args = '--cflags --libs libavformat >= 52.3.0', | |
314 | uselib_store = 'AVFORMAT', | |
322 | 315 | mandatory = ctx.options.enable_avcodec) |
323 | ctx.check_cfg(package = 'libavutil', atleast_version = '52.3.0', | |
324 | args = '--cflags --libs', uselib_store = 'AVUTIL', | |
316 | ctx.check_cfg(package = 'libavutil', | |
317 | args = '--cflags --libs libavutil >= 52.3.0', | |
318 | uselib_store = 'AVUTIL', | |
325 | 319 | mandatory = ctx.options.enable_avcodec) |
326 | ctx.check_cfg(package = 'libavresample', atleast_version = '1.0.1', | |
327 | args = '--cflags --libs', uselib_store = 'AVRESAMPLE', | |
328 | mandatory = ctx.options.enable_avcodec) | |
329 | if all ( 'HAVE_' + i in ctx.env | |
330 | for i in ['AVCODEC', 'AVFORMAT', 'AVUTIL', 'AVRESAMPLE'] ): | |
320 | ctx.check_cfg(package = 'libswresample', | |
321 | args = '--cflags --libs libswresample >= 1.2.0', | |
322 | uselib_store = 'SWRESAMPLE', | |
323 | mandatory = False) | |
324 | if 'HAVE_SWRESAMPLE' not in ctx.env: | |
325 | ctx.check_cfg(package = 'libavresample', | |
326 | args = '--cflags --libs libavresample >= 1.0.1', | |
327 | uselib_store = 'AVRESAMPLE', | |
328 | mandatory = False) | |
329 | ||
330 | msg_check = 'Checking for all libav libraries' | |
331 | if 'HAVE_AVCODEC' not in ctx.env: | |
332 | ctx.msg(msg_check, 'not found (missing avcodec)', color = 'YELLOW') | |
333 | elif 'HAVE_AVFORMAT' not in ctx.env: | |
334 | ctx.msg(msg_check, 'not found (missing avformat)', color = 'YELLOW') | |
335 | elif 'HAVE_AVUTIL' not in ctx.env: | |
336 | ctx.msg(msg_check, 'not found (missing avutil)', color = 'YELLOW') | |
337 | elif 'HAVE_SWRESAMPLE' not in ctx.env and 'HAVE_AVRESAMPLE' not in ctx.env: | |
338 | resample_missing = 'not found (avresample or swresample required)' | |
339 | ctx.msg(msg_check, resample_missing, color = 'YELLOW') | |
340 | else: | |
341 | ctx.msg(msg_check, 'yes') | |
342 | if 'HAVE_SWRESAMPLE' in ctx.env: | |
343 | ctx.define('HAVE_SWRESAMPLE', 1) | |
344 | elif 'HAVE_AVRESAMPLE' in ctx.env: | |
345 | ctx.define('HAVE_AVRESAMPLE', 1) | |
331 | 346 | ctx.define('HAVE_LIBAV', 1) |
332 | ctx.msg('Checking for all libav libraries', 'yes') | |
333 | else: | |
334 | ctx.msg('Checking for all libav libraries', 'not found', color = 'YELLOW') | |
335 | 347 | |
336 | 348 | if (ctx.options.enable_wavread != False): |
337 | 349 | ctx.define('HAVE_WAVREAD', 1) |
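The wscript changes above drop waf's `atleast_version` keyword and instead embed the version constraint directly in the pkg-config argument list, e.g. `pkg-config --cflags --libs 'fftw3f >= 3.0.0'`. A hypothetical helper showing the argument shape being generated (the helper itself is not part of waf or aubio):

```python
def pkgconfig_args(package, minver):
    # Hypothetical helper: build the command line that the
    # "args = '--cflags --libs <pkg> >= <ver>'" form above hands to
    # pkg-config, with the version constraint inline in the module list.
    return ['pkg-config', '--cflags', '--libs',
            '%s >= %s' % (package, minver)]

args = pkgconfig_args('fftw3f', '3.0.0')
```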
428 | 440 | def doxygen(bld): |
429 | 441 | # build documentation from source files using doxygen |
430 | 442 | if bld.env['DOXYGEN']: |
431 | bld( name = 'doxygen', rule = 'doxygen ${SRC} > /dev/null', | |
443 | bld.env.VERSION = VERSION | |
444 | rule = '( cat ${SRC} && echo PROJECT_NUMBER=${VERSION}; )' | |
445 | rule += ' | doxygen - > /dev/null' | |
446 | bld( name = 'doxygen', rule = rule, | |
432 | 447 | source = 'doc/web.cfg', |
433 | 448 | target = '../doc/web/html/index.html', |
434 | 449 | cwd = 'doc') |
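The new doxygen rule appends a `PROJECT_NUMBER` override to the configuration stream and pipes it to `doxygen -`, which reads its configuration from stdin; since later assignments override earlier ones in a Doxyfile, this sets the version without editing `doc/web.cfg`. The same subshell trick, shown here with plain shell only (no doxygen required; the demo file name and contents are illustrative):

```shell
# Append an override to the streamed config, as the waf rule
# '( cat ${SRC} && echo PROJECT_NUMBER=${VERSION}; )' does.
printf 'PROJECT_NAME = aubio\n' > web.cfg.demo
merged=$( ( cat web.cfg.demo && echo 'PROJECT_NUMBER = 0.4.5' ) )
rm web.cfg.demo
echo "$merged"
```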