New upstream version 4.2.6
Mathieu Malaterre
2 years ago
43 | 43 | #vim files |
44 | 44 | .*.sw? |
45 | 45 | .*~ |
46 | ||
47 | # VS Code | |
48 | .vscode/ | |
49 | ||
50 | # Web site | |
51 | **/_site | |
52 | **/.cache | |
53 | **/node_modules |
6 | 6 | - oraclejdk9 |
7 | 7 | - oraclejdk8 |
8 | 8 | - openjdk7 |
9 | cache: | |
10 | directories: | |
11 | - $HOME/.m2 | |
9 | 12 | before_install: .travis/before_install.sh |
10 | 13 | script: mvn verify |
11 | 14 | after_success: .travis/after_success.sh |
0 | 0 | # EPUBCheck change log |
1 | ||
2 | <a name="4.2.6"></a> | |
3 | ## [4.2.6](https://github.com/w3c/epubcheck/compare/v4.2.5...v4.2.6) (2021-06-30) | |
4 | ||
5 | This is the **latest production-ready** release of EPUBCheck. It provides complete support for checking conformance to the [EPUB 3.2](https://w3c.github.io/publ-epub-revision/epub32/spec/epub-spec.html) family of specifications. | |
6 | ||
7 | Version 4.2.6 is a maintenance release that reverts two checks introduced in v4.2.5. In this newer version: | |
8 | - the `role` property can now refine `creator`, `contributor`, and `publisher` properties more than once | |
9 | - Media Overlays no longer have to match the reading order of the associated Content Documents | |
10 | ||
11 | Reverting these checks proactively adopts rules that will be relaxed in EPUB 3.3, as the stricter checks introduced in v4.2.5 proved problematic for some users. | |
12 | ||
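For illustration, a minimal package-metadata fragment of the kind EPUBCheck 4.2.6 accepts again: two `role` meta elements refining the same `dc:creator` (v4.2.5 reported this as an error). The creator name and ID are invented for the example; the MARC relator codes `aut` and `ill` are real.

```xml
<metadata xmlns:dc="http://purl.org/dc/elements/1.1/">
  <dc:creator id="creator">Jane Doe</dc:creator>
  <!-- Two role refinements of the same creator:
       rejected by v4.2.5, allowed again in v4.2.6 -->
  <meta refines="#creator" property="role" scheme="marc:relators">aut</meta>
  <meta refines="#creator" property="role" scheme="marc:relators">ill</meta>
</metadata>
```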
13 | This release was made by the DAISY Consortium for the W3C. Many thanks to everyone who contributed and reported issues! | |
14 | ||
15 | This EPUBCheck version is also available in the Maven Central Repository as [`org.w3c:epubcheck:4.2.6`](https://search.maven.org/artifact/org.w3c/epubcheck/4.2.6/jar). | |
16 | ||
17 | ### Features | |
18 | ||
19 | * allow multiple roles for creator, contributor, and publisher ([#1258](https://github.com/w3c/epubcheck/issues/1258)) ([6c68c61](https://github.com/w3c/epubcheck/commit/6c68c61)), closes [#1230](https://github.com/w3c/epubcheck/issues/1230) | |
20 | * do not report Media Overlays ordering mismatch ([1cd7d77](https://github.com/w3c/epubcheck/commit/1cd7d77)) | |
21 | ||
22 | ||
23 | ||
24 | ||
25 | <a name="4.2.5"></a> | |
26 | ## [4.2.5](https://github.com/w3c/epubcheck/compare/v4.2.4...v4.2.5) (2021-03-15) | |
27 | ||
28 | ||
29 | ### Features | |
30 | ||
31 | * check playback properties do not have 'refines' attribute ([05a6a20](https://github.com/w3c/epubcheck/commit/05a6a20)) | |
32 | * check reading order of Media Overlays text elements ([e35bd05](https://github.com/w3c/epubcheck/commit/e35bd05)) | |
33 | * check references between Media Overlays and Content documents ([f49aa84](https://github.com/w3c/epubcheck/commit/f49aa84)) | |
34 | * check remote resource usage in Media Overlays ([df16ede](https://github.com/w3c/epubcheck/commit/df16ede)) | |
35 | * check required cardinality of meta properties ([edcd253](https://github.com/w3c/epubcheck/commit/edcd253)), closes [#1121](https://github.com/w3c/epubcheck/issues/1121) | |
36 | * check that Media Overlays are only defined for XHTML and SVG content documents ([5ae1aa9](https://github.com/w3c/epubcheck/commit/5ae1aa9)) | |
37 | * check the epub:textref attribute on Media Overlays body and seq elements ([eea1574](https://github.com/w3c/epubcheck/commit/eea1574)) | |
38 | * improve checking of audio clip times in Media Overlays ([11b652e](https://github.com/w3c/epubcheck/commit/11b652e)) | |
39 | * report unknown 'epub:type' values in overlays as USAGE only ([#1171](https://github.com/w3c/epubcheck/issues/1171)) ([f8a2517](https://github.com/w3c/epubcheck/commit/f8a2517)) | |
40 | * update HTML schemas from the HTML Checker ([56dcbd1](https://github.com/w3c/epubcheck/commit/56dcbd1)) | |
41 | * verify 'media:duration' property use valid SMIL clock values ([794b7ce](https://github.com/w3c/epubcheck/commit/794b7ce)), closes [#1174](https://github.com/w3c/epubcheck/issues/1174) | |
42 | ||
43 | ### Bug Fixes | |
44 | ||
45 | * allow empty `xml:lang` attributes in Package Documents ([177af8f](https://github.com/w3c/epubcheck/commit/177af8f)), closes [#777](https://github.com/w3c/epubcheck/issues/777) | |
46 | * allow the 'glossary' manifest item property ([d1727d8](https://github.com/w3c/epubcheck/commit/d1727d8)), closes [#1170](https://github.com/w3c/epubcheck/issues/1170) | |
47 | * do not report fragment-only CSS URLs ([6fa3312](https://github.com/w3c/epubcheck/commit/6fa3312)), closes [#1198](https://github.com/w3c/epubcheck/issues/1198) | |
48 | * do not require the Navigation Document to have an index in an Index Publication ([33f2f99](https://github.com/w3c/epubcheck/commit/33f2f99)), closes [#1122](https://github.com/w3c/epubcheck/issues/1122) | |
49 | * do not treat escaped <a> elements as hyperlinks in HTM-053 ([5949b6c](https://github.com/w3c/epubcheck/commit/5949b6c)), closes [#1182](https://github.com/w3c/epubcheck/issues/1182) | |
50 | * remove the user directory only at the start of paths (in messages) ([5ee72e7](https://github.com/w3c/epubcheck/commit/5ee72e7)), closes [#1181](https://github.com/w3c/epubcheck/issues/1181) | |
51 | ||
1 | 52 | |
2 | 53 | <a name="4.2.4"></a> |
3 | 54 | ## [4.2.4](https://github.com/w3c/epubcheck/compare/v4.2.3...v4.2.4) (2020-06-23) |
4 | ||
5 | This is the **latest production-ready** release of EPUBCheck, which provides complete support for checking conformance to the [EPUB 3.2](https://w3c.github.io/publ-epub-revision/epub32/spec/epub-spec.html) family of specifications. | |
6 | ||
7 | Version 4.2.4 is a maintenance release, made by the DAISY Consortium for the W3C. Many thanks to everyone who contributed! | |
8 | ||
9 | This EPUBCheck version is also available in the Maven Central Repository as [`org.w3c:epubcheck:4.2.4`](https://search.maven.org/artifact/org.w3c/epubcheck/4.2.4/jar). | |
10 | 55 | |
11 | 56 | ### Bug Fixes |
12 | 57 |
19 | 19 | |
20 | 20 | Check the [releases page](https://github.com/w3c/epubcheck/releases) to get the latest distribution. |
21 | 21 | |
22 | [EPUBCheck 4.2.4](https://github.com/w3c/epubcheck/releases/tag/v4.2.4) is the latest production-ready release, to be used to validate both EPUB 2 and 3 files. EPUB 3 publications are checked against the EPUB 3.2 family of specifications. | |
22 | [EPUBCheck 4.2.6](https://github.com/w3c/epubcheck/releases/tag/v4.2.6) is the latest production-ready release, to be used to validate both EPUB 2 and 3 files. EPUB 3 publications are checked against the EPUB 3.2 family of specifications. | |
23 | 23 | |
24 | 24 | ## Documentation |
25 | 25 | |
32 | 32 | ## Building EPUBCheck |
33 | 33 | |
34 | 34 | To build epubcheck from the sources you need Java Development Kit (JDK) 1.7 or above and [Apache Maven](http://maven.apache.org/) 3.0 or above installed. |
35 | ||
36 | You will also need Python to be able to run the BookReporter and related tools. | |
37 | 35 | |
38 | 36 | Build and run tests: |
39 | 37 |
10 | 10 | |
11 | 11 | <groupId>org.w3c</groupId> |
12 | 12 | <artifactId>epubcheck</artifactId> |
13 | <version>4.2.4</version> | |
13 | <version>4.2.6</version> | |
14 | 14 | |
15 | 15 | <packaging>jar</packaging> |
16 | 16 | |
142 | 142 | <connection>scm:git:ssh://git@github.com:w3c/epubcheck.git</connection> |
143 | 143 | <developerConnection>scm:git:ssh://git@github.com:w3c/epubcheck.git</developerConnection>
144 | 144 | <url>https://github.com/w3c/epubcheck</url> |
145 | <tag>v4.2.4</tag> | |
145 | <tag>v4.2.6</tag> | |
146 | 146 | </scm> |
147 | 147 | <issueManagement> |
148 | 148 | <system>Github</system> |
156 | 156 | <maven.build.timestamp.format>yyyy-MM-dd</maven.build.timestamp.format> |
157 | 157 | <tool.build.date>${maven.build.timestamp}</tool.build.date> |
158 | 158 | </properties> |
159 | ||
160 | <prerequisites> | |
161 | <maven>3.0</maven> | |
162 | </prerequisites> | |
163 | 159 | |
164 | 160 | <dependencies> |
165 | 161 | <dependency> |
195 | 191 | <version>1.3</version> |
196 | 192 | </dependency> |
197 | 193 | <dependency> |
198 | <groupId>junit</groupId> | |
199 | <artifactId>junit</artifactId> | |
200 | <version>4.10</version> | |
201 | <scope>test</scope> | |
202 | </dependency> | |
203 | <dependency> | |
204 | 194 | <groupId>com.google.guava</groupId> |
205 | 195 | <artifactId>guava</artifactId> |
206 | 196 | <version>24.1.1-android</version> |
220 | 210 | <artifactId>jackson-mapper-asl</artifactId> |
221 | 211 | <version>1.9.12</version> |
222 | 212 | </dependency> |
213 | <!-- ================= --> | |
214 | <!-- Test Dependencies --> | |
215 | <!-- ================= --> | |
216 | <dependency> | |
217 | <groupId>junit</groupId> | |
218 | <artifactId>junit</artifactId> | |
219 | <version>4.13.1</version> | |
220 | <scope>test</scope> | |
221 | </dependency> | |
222 | <dependency> | |
223 | <groupId>com.googlecode.json-simple</groupId> | |
224 | <artifactId>json-simple</artifactId> | |
225 | <version>1.1.1</version> | |
226 | <scope>test</scope> | |
227 | </dependency> | |
223 | 228 | <dependency> |
224 | 229 | <groupId>xmlunit</groupId> |
225 | 230 | <artifactId>xmlunit</artifactId> |
226 | 231 | <version>1.3</version> |
232 | <scope>test</scope> | |
233 | </dependency> | |
234 | <dependency> | |
235 | <groupId>io.cucumber</groupId> | |
236 | <artifactId>cucumber-java</artifactId> | |
237 | <version>4.5.4</version> | |
238 | <scope>test</scope> | |
239 | </dependency> | |
240 | <dependency> | |
241 | <groupId>io.cucumber</groupId> | |
242 | <artifactId>cucumber-junit</artifactId> | |
243 | <version>4.5.4</version> | |
244 | <scope>test</scope> | |
245 | </dependency> | |
246 | <dependency> | |
247 | <groupId>io.cucumber</groupId> | |
248 | <artifactId>cucumber-picocontainer</artifactId> | |
249 | <version>4.7.1</version> | |
250 | <scope>test</scope> | |
251 | </dependency> | |
252 | <dependency> | |
253 | <groupId>org.hamcrest</groupId> | |
254 | <artifactId>hamcrest</artifactId> | |
255 | <version>2.1</version> | |
227 | 256 | <scope>test</scope> |
228 | 257 | </dependency> |
229 | 258 | </dependencies> |
254 | 283 | <plugins> |
255 | 284 | <plugin> |
256 | 285 | <groupId>org.apache.maven.plugins</groupId> |
286 | <artifactId>maven-enforcer-plugin</artifactId> | |
287 | <version>3.0.0-M3</version> | |
288 | <executions> | |
289 | <execution> | |
290 | <id>enforce-maven</id> | |
291 | <goals> | |
292 | <goal>enforce</goal> | |
293 | </goals> | |
294 | <configuration> | |
295 | <rules> | |
296 | <requireMavenVersion> | |
297 | <version>3.0</version> | |
298 | </requireMavenVersion> | |
299 | </rules> | |
300 | </configuration> | |
301 | </execution> | |
302 | </executions> | |
303 | </plugin> | |
304 | <plugin> | |
305 | <groupId>org.apache.maven.plugins</groupId> | |
257 | 306 | <artifactId>maven-dependency-plugin</artifactId> |
258 | 307 | <version>3.0.2</version> |
259 | 308 | <executions> |
321 | 370 | <goals> |
322 | 371 | <goal>test</goal> |
323 | 372 | </goals> |
324 | <configuration> | |
325 | <excludes> | |
326 | <exclude>**/message_coverage.java</exclude> | |
327 | </excludes> | |
328 | <properties> | |
329 | <property> | |
330 | <name>listener</name> | |
331 | <value>com.adobe.epubcheck.test.TestRunListener</value> | |
332 | </property> | |
333 | </properties> | |
334 | </configuration> | |
335 | </execution> | |
336 | <execution> | |
337 | <id>coverage-test</id> | |
338 | <phase>package</phase> | |
339 | <goals> | |
340 | <goal>test</goal> | |
341 | </goals> | |
342 | <configuration> | |
343 | <includes> | |
344 | <include>**/message_coverage.java</include> | |
345 | </includes> | |
346 | </configuration> | |
347 | 373 | </execution> |
348 | 374 | </executions> |
349 | 375 | </plugin> |
403 | 429 | <plugin> |
404 | 430 | <groupId>org.codehaus.mojo</groupId> |
405 | 431 | <artifactId>license-maven-plugin</artifactId> |
406 | <version>1.14</version> | |
432 | <version>1.20</version> | |
407 | 433 | <executions> |
408 | 434 | <execution> |
409 | 435 | <id>thirdparty-licenses</id> |
412 | 438 | </goals> |
413 | 439 | <configuration> |
414 | 440 | <includedScopes>runtime,compile</includedScopes> |
415 | <fileTemplate>${project.basedir}/src/main/licenses/third-party.ftl</fileTemplate> | |
441 | <fileTemplate>${project.basedir}/src/main/licenses/THIRD-PARTY.ftl</fileTemplate> | |
442 | <missingFile>${project.basedir}/src/main/licenses/THIRD-PARTY.properties</missingFile> | |
443 | <deployMissingFile>false</deployMissingFile> | |
444 | <useMissingFile>true</useMissingFile> | |
445 | <includedScopes>runtime,compile</includedScopes> | |
446 | <licenseMerges> | |
447 | <licenseMerge>Apache License, Version 2.0|The Apache Software License, Version 2.0|Apache 2.0</licenseMerge> | |
448 | <licenseMerge>The 3-Clause BSD License|The BSD License|3-clause BSD license</licenseMerge> | |
449 | <licenseMerge>The MIT License|MIT license</licenseMerge> | |
450 | </licenseMerges> | |
416 | 451 | </configuration> |
417 | 452 | </execution> |
418 | 453 | </executions> |
43 | 43 | |
44 | 44 | echo "" |
45 | 45 | echo "Processing file '${file}'" |
46 | file ${file} | grep 'ISO-8859' > /dev/null | |
46 | file ${file} | grep -e 'ISO-8859\|data' > /dev/null | |
47 | 47 | if [ $? -eq 0 ]; then |
48 | 48 | escapeISO88591 ${file} |
49 | 49 | fi |
33 | 33 | <include>*.txt</include> |
34 | 34 | </includes> |
35 | 35 | </fileSet> |
36 | <!--<fileSet> | |
37 | <outputDirectory>docs</outputDirectory> | |
38 | <directory>docs</directory> | |
39 | <includes> | |
40 | <include>*.docx</include> | |
41 | <include>*.pdf</include> | |
42 | </includes> | |
43 | </fileSet>--> | |
44 | <!--<fileSet> | |
45 | <outputDirectory></outputDirectory> | |
46 | <directory>target</directory> | |
47 | <includes> | |
48 | <include>*.py</include> | |
49 | <include>*.pdf</include> | |
50 | </includes> | |
51 | </fileSet>--> | |
52 | 36 | </fileSets> |
53 | 37 | <dependencySets> |
54 | 38 | <dependencySet> |
55 | 39 | <useProjectArtifact>true</useProjectArtifact> |
56 | 40 | <useTransitiveDependencies>false</useTransitiveDependencies> |
57 | 41 | <includes> |
58 | <include>com.adobe:epubcheck</include> | |
42 | <include>org.w3c:epubcheck</include> | |
59 | 43 | </includes> |
60 | 44 | <outputFileNameMapping>${artifact.artifactId}.${artifact.extension}</outputFileNameMapping> |
61 | 45 | </dependencySet> |
319 | 319 | { |
320 | 320 | if (uri != null && uri.trim().length() > 0) |
321 | 321 | { |
322 | String resolved = PathUtil.resolveRelativeReference(path, uri); | |
323 | xrefChecker.registerReference(path, correctedLineNumber(line), correctedColumnNumber(line, col), resolved, type); | |
324 | if (PathUtil.isRemote(resolved)) { | |
325 | detectedProperties.add(ITEM_PROPERTIES.REMOTE_RESOURCES); | |
322 | // Fragment-only URLs should be resolved relative to the host document | |
323 | // Since we don't have access to the path of the host document(s) here, | |
324 | // we ignore this case | |
325 | if (!uri.startsWith("#")) { | |
326 | String resolved = PathUtil.resolveRelativeReference(path, uri); | |
327 | xrefChecker.registerReference(path, correctedLineNumber(line), correctedColumnNumber(line, col), resolved, type); | |
328 | if (PathUtil.isRemote(resolved)) { | |
329 | detectedProperties.add(ITEM_PROPERTIES.REMOTE_RESOURCES); | |
330 | } | |
326 | 331 | } |
327 | 332 | } |
328 | 333 | else |
19 | 19 | * ========================================================<br/> |
20 | 20 | */ |
21 | 21 | public class FileLinkSearch extends TextSearch { |
22 | private static final Pattern fileLinkPattern = Pattern.compile("href=[\"']file://"); | |
22 | private static final Pattern fileLinkPattern = Pattern.compile("<a\\s([^<>]*\\s)?href=[\"']file://"); | |
23 | 23 | |
24 | 24 | public FileLinkSearch(EPUBVersion version, ZipFile zip, Report report) |
25 | 25 | { |
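The effect of the tightened pattern can be sketched with a small self-contained probe (the class and method names below are invented; only the regex comes from the diff): the old pattern `href=["']file://` also fired on escaped sample markup in prose, while the new one requires a literal `<a …>` start tag.

```java
import java.util.regex.Pattern;

public class FileLinkProbe {
    // Tightened pattern from FileLinkSearch: a literal <a ...> start tag
    // whose href attribute value begins with file://
    private static final Pattern FILE_LINK =
        Pattern.compile("<a\\s([^<>]*\\s)?href=[\"']file://");

    public static boolean containsFileLink(String html) {
        return FILE_LINK.matcher(html).find();
    }

    public static void main(String[] args) {
        // Real hyperlink to a local file: still reported
        System.out.println(containsFileLink("<a href=\"file:///C:/book/ch1.xhtml\">ch1</a>"));
        // Extra attributes before href: still reported
        System.out.println(containsFileLink("<a class=\"x\" href='file://server/share'>s</a>"));
        // Escaped markup (no literal <a> tag): no longer reported
        System.out.println(containsFileLink("&lt;a href=\"file://example\"&gt;"));
    }
}
```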
143 | 143 | severities.put(MessageId.MED_005, Severity.ERROR); |
144 | 144 | severities.put(MessageId.MED_006, Severity.USAGE); |
145 | 145 | severities.put(MessageId.MED_007, Severity.ERROR); |
146 | severities.put(MessageId.MED_008, Severity.ERROR); | |
147 | severities.put(MessageId.MED_009, Severity.ERROR); | |
148 | severities.put(MessageId.MED_010, Severity.ERROR); | |
149 | severities.put(MessageId.MED_011, Severity.ERROR); | |
150 | severities.put(MessageId.MED_012, Severity.ERROR); | |
151 | severities.put(MessageId.MED_013, Severity.ERROR); | |
152 | severities.put(MessageId.MED_014, Severity.ERROR); | |
153 | severities.put(MessageId.MED_015, Severity.USAGE); | |
146 | 154 | |
147 | 155 | // NAV |
148 | 156 | severities.put(MessageId.NAV_001, Severity.ERROR); |
137 | 137 | MED_005("MED-005"), |
138 | 138 | MED_006("MED_006"), |
139 | 139 | MED_007("MED_007"), |
140 | MED_008("MED-008"), | |
141 | MED_009("MED-009"), | |
142 | MED_010("MED_010"), | |
143 | MED_011("MED_011"), | |
144 | MED_012("MED_012"), | |
145 | MED_013("MED_013"), | |
146 | MED_014("MED_014"), | |
147 | MED_015("MED_015"), | |
140 | 148 | |
141 | 149 | // Epub3 based table of content errors |
142 | 150 | NAV_001("NAV-001"), |
49 | 49 | @SuppressWarnings("unchecked") |
50 | 50 | private final static ValidatorMap validatorMap = ValidatorMap.builder() |
51 | 51 | .putAll(XMLValidators.NAV_30_RNC, XMLValidators.XHTML_30_SCH, XMLValidators.NAV_30_SCH) |
52 | .putAll(and(Predicates.or(profile(EPUBProfile.EDUPUB), hasPubType(OPFData.DC_TYPE_EDUPUB)), | |
53 | not( | |
54 | hasProp(EpubCheckVocab.VOCAB.get(EpubCheckVocab.PROPERTIES.FIXED_LAYOUT))), | |
52 | .putAll( | |
53 | and(Predicates.or(profile(EPUBProfile.EDUPUB), hasPubType(OPFData.DC_TYPE_EDUPUB)), | |
54 | not(hasProp(EpubCheckVocab.VOCAB.get(EpubCheckVocab.PROPERTIES.FIXED_LAYOUT))), | |
55 | 55 | not(hasProp(EpubCheckVocab.VOCAB.get(EpubCheckVocab.PROPERTIES.NON_LINEAR)))), |
56 | 56 | XMLValidators.XHTML_EDUPUB_STRUCTURE_SCH, XMLValidators.XHTML_EDUPUB_SEMANTICS_SCH, |
57 | 57 | XMLValidators.XHTML_IDX_SCH) |
60 | 60 | mimetype("application/xhtml+xml"), version(EPUBVersion.VERSION_3)), |
61 | 61 | XMLValidators.XHTML_DICT_SCH) |
62 | 62 | .putAll( |
63 | and(or(profile(EPUBProfile.IDX), hasPubType(OPFData.DC_TYPE_INDEX), | |
64 | hasProp(PackageVocabs.ITEM_VOCAB.get(PackageVocabs.ITEM_PROPERTIES.INDEX)), | |
63 | and(or(hasProp(PackageVocabs.ITEM_VOCAB.get(PackageVocabs.ITEM_PROPERTIES.INDEX)), | |
65 | 64 | hasProp(EpubCheckVocab.VOCAB.get(EpubCheckVocab.PROPERTIES.IN_INDEX_COLLECTION))), |
66 | mimetype("application/xhtml+xml"), version(EPUBVersion.VERSION_3)), | |
65 | mimetype("application/xhtml+xml"), version(EPUBVersion.VERSION_3)), | |
67 | 66 | XMLValidators.XHTML_IDX_SCH, XMLValidators.XHTML_IDX_INDEX_SCH) |
68 | 67 | .build(); |
69 | 68 |
42 | 42 | import com.adobe.epubcheck.ocf.OCFPackage; |
43 | 43 | import com.adobe.epubcheck.opf.ValidationContext.ValidationContextBuilder; |
44 | 44 | import com.adobe.epubcheck.ops.OPSCheckerFactory; |
45 | import com.adobe.epubcheck.overlay.OverlayTextChecker; | |
45 | 46 | import com.adobe.epubcheck.util.EPUBVersion; |
46 | 47 | import com.adobe.epubcheck.util.FeatureEnum; |
47 | 48 | import com.adobe.epubcheck.util.PathUtil; |
113 | 114 | newContext.pubTypes(opfData != null ? opfData.getTypes() : null); |
114 | 115 | newContext.xrefChecker(new XRefChecker(context.ocf.get(), context.report, context.version)); |
115 | 116 | newContext.profile(EPUBProfile.makeOPFCompatible(context.profile, opfData, path, report)); |
117 | newContext.overlayTextChecker(new OverlayTextChecker()); | |
116 | 118 | } |
117 | 119 | this.context = newContext.build(); |
118 | 120 |
39 | 39 | import com.adobe.epubcheck.opf.ResourceCollection.Roles; |
40 | 40 | import com.adobe.epubcheck.ops.OPSCheckerFactory; |
41 | 41 | import com.adobe.epubcheck.overlay.OverlayCheckerFactory; |
42 | import com.adobe.epubcheck.overlay.OverlayTextChecker; | |
42 | 43 | import com.adobe.epubcheck.util.EPUBVersion; |
43 | 44 | import com.adobe.epubcheck.util.FeatureEnum; |
44 | 45 | import com.adobe.epubcheck.util.PathUtil; |
178 | 179 | else { |
179 | 180 | report.message(MessageId.RSC_006, |
180 | 181 | EPUBLocation.create(path, item.getLineNumber(), item.getColumnNumber()), item.getPath()); |
182 | } | |
183 | } | |
184 | } | |
185 | ||
186 | if (isBlessedItemType(mediatype, version)) { | |
187 | // check whether media-overlay attribute needs to be specified | |
188 | OverlayTextChecker overlayTextChecker = context.overlayTextChecker.get(); | |
189 | String mo = item.getMediaOverlay(); | |
190 | String docpath = item.getPath(); | |
191 | if (overlayTextChecker.isReferencedByOverlay(docpath)) { | |
192 | if (Strings.isNullOrEmpty(mo)) { | |
193 | // missing media-overlay attribute | |
194 | report.message(MessageId.MED_010, EPUBLocation.create(path, item.getLineNumber(), item.getColumnNumber(), item.getPath())); | |
195 | } | |
196 | else if (!overlayTextChecker.isCorrectOverlay(docpath,mo)) { | |
197 | // media-overlay attribute references the wrong media overlay | |
198 | report.message(MessageId.MED_012, EPUBLocation.create(path, item.getLineNumber(), item.getColumnNumber(), item.getPath())); | |
199 | } | |
200 | } | |
201 | else { | |
202 | if (!Strings.isNullOrEmpty(mo)) { | |
203 | // referenced overlay does not reference this content document | |
204 | report.message(MessageId.MED_013, EPUBLocation.create(path, item.getLineNumber(), item.getColumnNumber(), item.getPath())); | |
181 | 205 | } |
182 | 206 | } |
183 | 207 | } |
313 | 313 | itemBuilders.put(id.trim(), itemBuilder); |
314 | 314 | itemBuildersByPath.put(href, itemBuilder); |
315 | 315 | |
316 | String mediaOverlay = e.getAttribute("media-overlay"); | |
317 | itemBuilder.mediaOverlay(mediaOverlay); | |
318 | ||
316 | 319 | report.info(href, FeatureEnum.UNIQUE_IDENT, id); |
317 | 320 | } |
318 | 321 | } |
53 | 53 | private final boolean scripted; |
54 | 54 | private final boolean linear; |
55 | 55 | private final boolean fixedLayout; |
56 | private final String mediaOverlay; | |
56 | 57 | |
57 | 58 | private OPFItem(String id, String path, String mimetype, int lineNumber, int columnNumber, |
58 | 59 | Optional<String> fallback, Optional<String> fallbackStyle, Set<Property> properties, |
59 | boolean ncx, int spinePosition, boolean nav, boolean scripted, boolean linear, boolean fxl) | |
60 | boolean ncx, int spinePosition, boolean nav, boolean scripted, boolean linear, boolean fxl, String mediaOverlay) | |
60 | 61 | { |
61 | 62 | this.id = id; |
62 | 63 | this.path = path; |
73 | 74 | this.scripted = scripted; |
74 | 75 | this.linear = linear; |
75 | 76 | this.fixedLayout = fxl; |
77 | this.mediaOverlay = mediaOverlay; | |
76 | 78 | } |
77 | 79 | |
78 | 80 | /** |
237 | 239 | public boolean isFixedLayout() |
238 | 240 | { |
239 | 241 | return fixedLayout; |
242 | } | |
243 | ||
244 | public String getMediaOverlay() | |
245 | { | |
246 | return mediaOverlay; | |
240 | 247 | } |
241 | 248 | |
242 | 249 | @Override |
292 | 299 | private boolean linear = true; |
293 | 300 | private int spinePosition = -1; |
294 | 301 | private boolean fxl = false; |
302 | private String mediaOverlay; | |
295 | 303 | private ImmutableSet.Builder<Property> propertiesBuilder = new ImmutableSet.Builder<Property>(); |
296 | 304 | |
297 | 305 | /** |
338 | 346 | |
339 | 347 | } |
340 | 348 | |
349 | public Builder mediaOverlay(String path) | |
350 | { | |
351 | this.mediaOverlay = path; | |
352 | return this; | |
353 | } | |
354 | ||
341 | 355 | public Builder ncx() |
342 | 356 | { |
343 | 357 | this.ncx = true; |
387 | 401 | properties, ncx, spinePosition, |
388 | 402 | properties.contains(PackageVocabs.ITEM_VOCAB.get(PackageVocabs.ITEM_PROPERTIES.NAV)), |
389 | 403 | properties.contains(PackageVocabs.ITEM_VOCAB.get(PackageVocabs.ITEM_PROPERTIES.SCRIPTED)), |
390 | linear, fxl); | |
404 | linear, fxl, mediaOverlay); | |
391 | 405 | } |
392 | 406 | } |
393 | 407 | } |
9 | 9 | import com.adobe.epubcheck.api.LocalizableReport; |
10 | 10 | import com.adobe.epubcheck.api.Report; |
11 | 11 | import com.adobe.epubcheck.ocf.OCFPackage; |
12 | import com.adobe.epubcheck.overlay.OverlayTextChecker; | |
12 | 13 | import com.adobe.epubcheck.util.EPUBVersion; |
13 | 14 | import com.adobe.epubcheck.util.GenericResourceProvider; |
14 | 15 | import com.adobe.epubcheck.vocab.Property; |
77 | 78 | */ |
78 | 79 | public final Optional<XRefChecker> xrefChecker; |
79 | 80 | /** |
81 | * The src checker for media overlay text elements, absent for single-file validation | |
82 | */ | |
83 | public final Optional<OverlayTextChecker> overlayTextChecker; | |
84 | /** | |
80 | 85 | * The set of 'dc:type' values declared at the OPF level. Guaranteed non-null, |
81 | 86 | * can be empty. |
82 | 87 | */ |
89 | 94 | private ValidationContext(String path, String mimeType, EPUBVersion version, EPUBProfile profile, |
90 | 95 | Report report, Locale locale, FeatureReport featureReport, |
91 | 96 | GenericResourceProvider resourceProvider, Optional<OPFItem> opfItem, Optional<OCFPackage> ocf, |
92 | Optional<XRefChecker> xrefChecker, Set<String> pubTypes, Set<Property> properties) | |
97 | Optional<XRefChecker> xrefChecker, Optional<OverlayTextChecker> overlayTextChecker, Set<String> pubTypes, Set<Property> properties) | |
93 | 98 | { |
94 | 99 | super(); |
95 | 100 | this.path = path; |
103 | 108 | this.opfItem = opfItem; |
104 | 109 | this.ocf = ocf; |
105 | 110 | this.xrefChecker = xrefChecker; |
111 | this.overlayTextChecker = overlayTextChecker; | |
106 | 112 | this.pubTypes = pubTypes; |
107 | 113 | this.properties = properties; |
108 | 114 | } |
124 | 130 | private GenericResourceProvider resourceProvider = null; |
125 | 131 | private OCFPackage ocf = null; |
126 | 132 | private XRefChecker xrefChecker = null; |
133 | private OverlayTextChecker overlayTextChecker = null; | |
127 | 134 | private Set<String> pubTypes = null; |
128 | 135 | private ImmutableSet.Builder<Property> properties = ImmutableSet.<Property> builder(); |
129 | 136 | |
147 | 154 | resourceProvider = context.resourceProvider; |
148 | 155 | ocf = context.ocf.orNull(); |
149 | 156 | xrefChecker = context.xrefChecker.orNull(); |
157 | overlayTextChecker = context.overlayTextChecker.orNull(); | |
150 | 158 | pubTypes = context.pubTypes; |
151 | 159 | properties = ImmutableSet.<Property> builder().addAll(context.properties); |
152 | 160 | return this; |
203 | 211 | public ValidationContextBuilder xrefChecker(XRefChecker xrefChecker) |
204 | 212 | { |
205 | 213 | this.xrefChecker = xrefChecker; |
214 | return this; | |
215 | } | |
216 | ||
217 | public ValidationContextBuilder overlayTextChecker(OverlayTextChecker overlayTextChecker) | |
218 | { | |
219 | this.overlayTextChecker = overlayTextChecker; | |
206 | 220 | return this; |
207 | 221 | } |
208 | 222 | |
242 | 256 | profile != null ? profile : EPUBProfile.DEFAULT, report, locale, |
243 | 257 | featureReport != null ? featureReport : new FeatureReport(), resourceProvider, |
244 | 258 | (xrefChecker != null) ? xrefChecker.getResource(path) : Optional.<OPFItem> absent(), |
245 | Optional.fromNullable(ocf), Optional.fromNullable(xrefChecker), | |
259 | Optional.fromNullable(ocf), Optional.fromNullable(xrefChecker), Optional.fromNullable(overlayTextChecker), | |
246 | 260 | pubTypes != null ? ImmutableSet.copyOf(pubTypes) : ImmutableSet.<String> of(), |
247 | 261 | properties.build()); |
248 | 262 | } |
68 | 68 | SEARCH_KEY, |
69 | 69 | NAV_TOC_LINK, |
70 | 70 | NAV_PAGELIST_LINK, |
71 | OVERLAY_TEXT_LINK, | |
71 | 72 | PICTURE_SOURCE, |
72 | 73 | PICTURE_SOURCE_FOREIGN; |
73 | 74 | } |
285 | 286 | // if (checkReference(reference)) checkReferenceSubtypes(reference); |
286 | 287 | Queue<Reference> tocLinks = new LinkedList<>(); |
287 | 288 | Queue<Reference> pageListLinks = new LinkedList<>(); |
289 | Queue<Reference> overlayLinks = new LinkedList<>(); | |
288 | 290 | for (Reference reference : references) |
289 | 291 | { |
290 | 292 | switch (reference.type) |
298 | 300 | case NAV_PAGELIST_LINK: |
299 | 301 | pageListLinks.add(reference); |
300 | 302 | break; |
303 | case OVERLAY_TEXT_LINK: | |
304 | overlayLinks.add(reference); | |
305 | break; | |
301 | 306 | default: |
302 | 307 | checkReference(reference); |
303 | 308 | break; |
305 | 310 | } |
306 | 311 | checkReadingOrder(tocLinks, -1, -1); |
307 | 312 | checkReadingOrder(pageListLinks, -1, -1); |
313 | checkReadingOrder(overlayLinks, -1, -1); | |
308 | 314 | } |
309 | 315 | |
310 | 316 | private void checkReference(Reference ref) |
532 | 538 | if (ref == null) return; |
533 | 539 | |
534 | 540 | Preconditions |
535 | .checkArgument(ref.type == Type.NAV_PAGELIST_LINK || ref.type == Type.NAV_TOC_LINK); | |
541 | .checkArgument(ref.type == Type.NAV_PAGELIST_LINK || ref.type == Type.NAV_TOC_LINK || ref.type == Type.OVERLAY_TEXT_LINK); | |
536 | 542 | Resource res = resources.get(ref.refResource); |
537 | 543 | |
538 | 544 | // abort early if the link target is not a spine item (checked elsewhere) |
544 | 550 | { |
545 | 551 | String orderContext = LocalizedMessages.getInstance(locale).getSuggestion(MessageId.NAV_011, |
546 | 552 | "spine"); |
547 | report.message(MessageId.NAV_011, | |
548 | EPUBLocation.create(ref.source, ref.lineNumber, ref.columnNumber), | |
549 | (ref.type == Type.NAV_TOC_LINK) ? "toc" : "page-list", ref.value, orderContext); | |
550 | report.message(MessageId.INF_001, | |
551 | EPUBLocation.create(ref.source, ref.lineNumber, ref.columnNumber), "https://github.com/w3c/publ-epub-revision/issues/1283"); | |
552 | lastSpinePosition = targetSpinePosition; | |
553 | lastAnchorPosition = -1; | |
554 | } | |
555 | else | |
556 | { | |
557 | ||
558 | // if new spine item, reset last positions | |
559 | if (targetSpinePosition > lastSpinePosition) | |
560 | { | |
561 | lastSpinePosition = targetSpinePosition; | |
562 | lastAnchorPosition = -1; | |
563 | } | |
564 | ||
565 | // check that the fragment is in document order | |
566 | int targetAnchorPosition = res.getAnchorPosition(ref.fragment); | |
567 | if (targetAnchorPosition < lastAnchorPosition) | |
568 | { | |
569 | String orderContext = LocalizedMessages.getInstance(locale).getSuggestion(MessageId.NAV_011, | |
570 | "document"); | |
553 | ||
554 | if (ref.type == Type.OVERLAY_TEXT_LINK) { | |
555 | report.message(MessageId.MED_015, | |
556 | EPUBLocation.create(ref.source, ref.lineNumber, ref.columnNumber), ref.value, orderContext); | |
557 | } | |
558 | else { | |
571 | 559 | report.message(MessageId.NAV_011, |
572 | 560 | EPUBLocation.create(ref.source, ref.lineNumber, ref.columnNumber), |
573 | 561 | (ref.type == Type.NAV_TOC_LINK) ? "toc" : "page-list", ref.value, orderContext); |
574 | 562 | report.message(MessageId.INF_001, |
575 | 563 | EPUBLocation.create(ref.source, ref.lineNumber, ref.columnNumber), "https://github.com/w3c/publ-epub-revision/issues/1283"); |
576 | 564 | } |
565 | lastSpinePosition = targetSpinePosition; | |
566 | lastAnchorPosition = -1; | |
567 | } | |
568 | else | |
569 | { | |
570 | ||
571 | // if new spine item, reset last positions | |
572 | if (targetSpinePosition > lastSpinePosition) | |
573 | { | |
574 | lastSpinePosition = targetSpinePosition; | |
575 | lastAnchorPosition = -1; | |
576 | } | |
577 | ||
578 | // check that the fragment is in document order | |
579 | int targetAnchorPosition = res.getAnchorPosition(ref.fragment); | |
580 | if (targetAnchorPosition < lastAnchorPosition) | |
581 | { | |
582 | String orderContext = LocalizedMessages.getInstance(locale).getSuggestion(MessageId.NAV_011, | |
583 | "document"); | |
584 | if (ref.type == Type.OVERLAY_TEXT_LINK) { | |
585 | report.message(MessageId.MED_015, | |
586 | EPUBLocation.create(ref.source, ref.lineNumber, ref.columnNumber), ref.value, orderContext); | |
587 | } | |
588 | else { | |
589 | report.message(MessageId.NAV_011, | |
590 | EPUBLocation.create(ref.source, ref.lineNumber, ref.columnNumber), | |
591 | (ref.type == Type.NAV_TOC_LINK) ? "toc" : "page-list", ref.value, orderContext); | |
592 | report.message(MessageId.INF_001, | |
593 | EPUBLocation.create(ref.source, ref.lineNumber, ref.columnNumber), "https://github.com/w3c/publ-epub-revision/issues/1283"); | |
594 | } | |
595 | } | |
577 | 596 | lastAnchorPosition = targetAnchorPosition; |
578 | 597 | } |
579 | 598 | checkReadingOrder(references, lastSpinePosition, lastAnchorPosition); |
244 | 244 | allowedProperties.add(ITEM_PROPERTIES.INDEX); |
245 | 245 | context.featureReport.report(FeatureEnum.INDEX, parser.getLocation(), null); |
246 | 246 | } |
247 | if (types.contains(EPUB_TYPES.GLOSSARY)) | |
248 | { | |
249 | allowedProperties.add(ITEM_PROPERTIES.GLOSSARY); | |
250 | } | |
247 | 251 | } |
248 | 252 | |
249 | 253 | @Override |
0 | /* | |
1 | * This class was originally created by the DAISY Consortium | |
2 | * for another project, licensed under LGPL v2.1. | |
3 | * It is now integrated in EPUBCheck and relicensed under | |
4 | * EPUBCheck’s primary license. | |
5 | * See https://github.com/w3c/epubcheck/pull/1173 | |
6 | */ | |
7 | package com.adobe.epubcheck.overlay; | |
8 | ||
9 | public class ClipTime { | |
10 | ||
11 | private final Double timeInMs; | |
12 | ||
13 | public ClipTime() { | |
14 | timeInMs = null; | |
15 | } | |
16 | ||
17 | public ClipTime(double timeInMs) { | |
18 | this.timeInMs = Double.valueOf(timeInMs); | |
19 | } | |
20 | ||
21 | public double getTimeInMs() { | |
22 | if(notSet()) { | |
23 | return 0; | |
24 | } else { | |
25 | return timeInMs; | |
26 | } | |
27 | } | |
28 | ||
29 | public ClipTime roundedToMilliSeconds() { | |
30 | return new ClipTime(Math.round(this.getTimeInMs())); | |
31 | } | |
32 | ||
33 | public ClipTime floorToMilliSeconds() { | |
34 | return new ClipTime(Math.floor(this.getTimeInMs())); | |
35 | } | |
36 | ||
37 | public boolean notSet() { | |
38 | return this.timeInMs == null; | |
39 | } | |
44 | ||
45 | public ClipTime add(ClipTime timeToAdd) { | |
46 | return new ClipTime(this.getTimeInMs() + timeToAdd.getTimeInMs()); | |
47 | } | |
48 | ||
49 | public ClipTime subtract(ClipTime timeToSubtract) { | |
50 | return new ClipTime(this.getTimeInMs() - timeToSubtract.getTimeInMs()); | |
51 | } | |
52 | }⏎ |
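The `ClipTime` class above wraps a nullable millisecond value so that "no clip time set" stays distinguishable from a time of zero. A minimal standalone sketch of that behavior (the `ClipTimeDemo` class and its trimmed inner replica are illustrative only, not part of EPUBCheck):

```java
// Illustrative sketch (not part of EPUBCheck): mirrors the ClipTime
// semantics above to show the null-vs-zero distinction.
public class ClipTimeDemo {

    // Trimmed replica of the ClipTime class defined in the diff above.
    static final class ClipTime {
        private final Double timeInMs;
        ClipTime() { this.timeInMs = null; }               // "not set"
        ClipTime(double timeInMs) { this.timeInMs = timeInMs; }
        double getTimeInMs() { return notSet() ? 0 : timeInMs; }
        boolean notSet() { return timeInMs == null; }
        ClipTime add(ClipTime other) {
            return new ClipTime(getTimeInMs() + other.getTimeInMs());
        }
    }

    public static void main(String[] args) {
        ClipTime unset = new ClipTime();
        ClipTime begin = new ClipTime(1500.25);
        // An unset clip time reads as 0 ms but is still detectable via notSet()
        System.out.println(unset.notSet());                 // true
        System.out.println(unset.getTimeInMs());            // 0.0
        System.out.println(begin.add(unset).getTimeInMs()); // 1500.25
    }
}
```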
0 | 0 | package com.adobe.epubcheck.overlay; |
1 | 1 | |
2 | import java.util.EnumSet; | |
3 | import java.util.HashSet; | |
2 | 4 | import java.util.Map; |
3 | 5 | import java.util.Set; |
4 | 6 | |
11 | 13 | import com.adobe.epubcheck.util.EpubConstants; |
12 | 14 | import com.adobe.epubcheck.util.HandlerUtil; |
13 | 15 | import com.adobe.epubcheck.util.PathUtil; |
16 | import com.adobe.epubcheck.vocab.AggregateVocab; | |
17 | import com.adobe.epubcheck.vocab.PackageVocabs; | |
18 | import com.adobe.epubcheck.vocab.PackageVocabs.ITEM_PROPERTIES; | |
19 | import com.adobe.epubcheck.vocab.Property; | |
14 | 20 | import com.adobe.epubcheck.vocab.StructureVocab; |
15 | 21 | import com.adobe.epubcheck.vocab.Vocab; |
16 | 22 | import com.adobe.epubcheck.vocab.VocabUtil; |
17 | 23 | import com.adobe.epubcheck.xml.XMLElement; |
18 | 24 | import com.adobe.epubcheck.xml.XMLHandler; |
19 | 25 | import com.adobe.epubcheck.xml.XMLParser; |
26 | import com.google.common.base.Strings; | |
20 | 27 | import com.google.common.collect.ImmutableMap; |
21 | 28 | import com.google.common.collect.ImmutableSet; |
29 | import com.google.common.collect.Sets; | |
22 | 30 | |
23 | 31 | public class OverlayHandler implements XMLHandler |
24 | 32 | { |
25 | 33 | |
26 | 34 | private static Map<String, Vocab> RESERVED_VOCABS = ImmutableMap.<String, Vocab> of("", |
27 | StructureVocab.VOCAB); | |
35 | AggregateVocab.of(StructureVocab.VOCAB, StructureVocab.UNCHECKED_VOCAB)); | |
28 | 36 | private static Map<String, Vocab> KNOWN_VOCAB_URIS = ImmutableMap.of(); |
29 | 37 | private static Set<String> DEFAULT_VOCAB_URIS = ImmutableSet.of(StructureVocab.URI); |
30 | ||
38 | ||
31 | 39 | private final ValidationContext context; |
32 | 40 | private final String path; |
33 | 41 | private final Report report; |
36 | 44 | private boolean checkedUnsupportedXMLVersion; |
37 | 45 | |
38 | 46 | private Map<String, Vocab> vocabs = RESERVED_VOCABS; |
39 | ||
47 | ||
48 | private Set<String> resourceRefs = new HashSet<String>(); | |
49 | ||
50 | private final Set<ITEM_PROPERTIES> requiredProperties = EnumSet.noneOf(ITEM_PROPERTIES.class); | |
51 | ||
40 | 52 | public OverlayHandler(ValidationContext context, XMLParser parser) |
41 | 53 | { |
42 | 54 | this.context = context; |
57 | 69 | XMLElement e = parser.getCurrentElement(); |
58 | 70 | String name = e.getName(); |
59 | 71 | |
60 | if (name.equals("smil")) | |
61 | { | |
62 | vocabs = VocabUtil.parsePrefixDeclaration( | |
63 | e.getAttributeNS(EpubConstants.EpubTypeNamespaceUri, "prefix"), RESERVED_VOCABS, | |
64 | KNOWN_VOCAB_URIS, DEFAULT_VOCAB_URIS, report, | |
65 | EPUBLocation.create(path, parser.getLineNumber(), parser.getColumnNumber())); | |
66 | } | |
67 | else if (name.equals("seq")) | |
68 | { | |
69 | processSeq(e); | |
70 | } | |
71 | else if (name.equals("text")) | |
72 | { | |
73 | processSrc(e); | |
74 | } | |
75 | else if (name.equals("audio")) | |
76 | { | |
77 | processRef(e.getAttribute("src"), XRefChecker.Type.AUDIO); | |
78 | } | |
79 | else if (name.equals("body") || name.equals("par")) | |
80 | { | |
81 | checkType(e.getAttributeNS(EpubConstants.EpubTypeNamespaceUri, "type")); | |
72 | switch (name) { | |
73 | case "smil": | |
74 | vocabs = VocabUtil.parsePrefixDeclaration( | |
75 | e.getAttributeNS(EpubConstants.EpubTypeNamespaceUri, "prefix"), RESERVED_VOCABS, | |
76 | KNOWN_VOCAB_URIS, DEFAULT_VOCAB_URIS, report, | |
77 | EPUBLocation.create(path, parser.getLineNumber(), parser.getColumnNumber())); | |
78 | break; | |
79 | ||
80 | case "body": | |
81 | case "seq": | |
82 | case "par": | |
83 | processGlobalAttrs(e); | |
84 | break; | |
85 | ||
86 | case "text": | |
87 | processTextSrc(e); | |
88 | break; | |
89 | ||
90 | case "audio": | |
91 | processAudioSrc(e); | |
92 | checkTime(e.getAttribute("clipBegin"), e.getAttribute("clipEnd")); | |
93 | break; | |
94 | } | |
95 | } | |
96 | ||
97 | private void checkTime(String clipBegin, String clipEnd) { | |
98 | ||
99 | if (clipEnd == null) { | |
100 | // missing clipEnd attribute means clip plays to end so no comparisons possible | |
101 | return; | |
102 | } | |
103 | ||
104 | if (clipBegin == null) { | |
105 | // set clipBegin to 0 if the attribute isn't set to allow comparisons | |
106 | clipBegin = "0"; | |
107 | } | |
108 | ||
109 | SmilClock start; | |
110 | SmilClock end; | |
111 | ||
112 | try { | |
113 | start = new SmilClock(clipBegin); | |
114 | end = new SmilClock(clipEnd); | |
115 | } | |
116 | catch (Exception ex) { | |
117 | // invalid clock time will be reported by the schema | |
118 | return; | |
119 | } | |
120 | ||
121 | if (start.compareTo(end) > 0) { | |
122 | // clipEnd is chronologically before clipBegin | |
123 | report.message(MessageId.MED_008, EPUBLocation.create(path, parser.getLineNumber(), parser.getColumnNumber())); | |
124 | } | |
125 | ||
126 | else if (start.equals(end)) { | |
127 | // clipBegin and clipEnd are equal | |
128 | report.message(MessageId.MED_009, EPUBLocation.create(path, parser.getLineNumber(), parser.getColumnNumber())); | |
82 | 129 | } |
83 | 130 | } |
84 | 131 | |
85 | 132 | private void checkType(String type) |
86 | 133 | { |
87 | VocabUtil.parsePropertyList(type, vocabs, context, | |
134 | Set<Property> propList = VocabUtil.parsePropertyList(type, vocabs, context, | |
88 | 135 | EPUBLocation.create(path, parser.getLineNumber(), parser.getColumnNumber())); |
89 | } | |
90 | ||
91 | private void processSrc(XMLElement e) | |
92 | { | |
93 | processRef(e.getAttribute("src"), XRefChecker.Type.HYPERLINK); | |
136 | ||
137 | // Check unrecognized properties from the structure vocab | |
138 | for (Property property : propList) | |
139 | { | |
140 | if (StructureVocab.URI.equals(property.getVocabURI())) try | |
141 | { | |
142 | property.toEnum(); | |
143 | } catch (UnsupportedOperationException ex) | |
144 | { | |
145 | report.message(MessageId.OPF_088, parser.getLocation(), property.getName()); | |
146 | } | |
147 | } | |
148 | } | |
149 | ||
150 | private void processTextSrc(XMLElement e) | |
151 | { | |
152 | String src = e.getAttribute("src"); | |
153 | ||
154 | processRef(src, XRefChecker.Type.HYPERLINK); | |
155 | ||
156 | String resolvedSrc = PathUtil.resolveRelativeReference(path, src); | |
157 | ||
158 | if (context.xrefChecker.isPresent()) | |
159 | { | |
160 | context.xrefChecker.get().registerReference(path, parser.getLineNumber(), | |
161 | parser.getColumnNumber(), resolvedSrc, XRefChecker.Type.OVERLAY_TEXT_LINK); | |
162 | } | |
163 | } | |
164 | ||
165 | private void processAudioSrc(XMLElement e) { | |
166 | ||
167 | String src = e.getAttribute("src"); | |
168 | ||
169 | processRef(src, XRefChecker.Type.AUDIO); | |
170 | ||
171 | if (src != null && PathUtil.isRemote(src)) | |
172 | { | |
173 | requiredProperties.add(ITEM_PROPERTIES.REMOTE_RESOURCES); | |
174 | } | |
94 | 175 | |
95 | 176 | } |
96 | 177 | |
107 | 188 | report.message(MessageId.MED_005, EPUBLocation.create(path, parser.getLineNumber(), parser.getColumnNumber()), ref, mimeType); |
108 | 189 | } |
109 | 190 | } |
191 | else { | |
192 | checkFragment(ref); | |
193 | String uniqueResource = PathUtil.removeFragment(ref); | |
194 | if (!Strings.isNullOrEmpty(uniqueResource)) { | |
195 | if (!context.overlayTextChecker.get().add(uniqueResource, context.opfItem.get().getId())) { | |
196 | report.message(MessageId.MED_011, EPUBLocation.create(path, parser.getLineNumber(), parser.getColumnNumber()), ref); | |
197 | } | |
198 | } | |
199 | } | |
110 | 200 | context.xrefChecker.get().registerReference(path, parser.getLineNumber(), |
111 | 201 | parser.getColumnNumber(), ref, type); |
112 | 202 | } |
113 | 203 | } |
114 | 204 | |
115 | private void processSeq(XMLElement e) | |
116 | { | |
117 | processRef(e.getAttributeNS(EpubConstants.EpubTypeNamespaceUri, "textref"), | |
118 | XRefChecker.Type.HYPERLINK); | |
205 | private void processGlobalAttrs(XMLElement e) | |
206 | { | |
207 | if (!e.getName().equals("audio")) { | |
208 | processRef(e.getAttributeNS(EpubConstants.EpubTypeNamespaceUri, "textref"), | |
209 | XRefChecker.Type.HYPERLINK); | |
210 | } | |
119 | 211 | checkType(e.getAttributeNS(EpubConstants.EpubTypeNamespaceUri, "type")); |
120 | 212 | } |
121 | 213 | |
125 | 217 | |
126 | 218 | public void endElement() |
127 | 219 | { |
220 | XMLElement e = parser.getCurrentElement(); | |
221 | String name = e.getName(); | |
222 | if (name.equals("smil")) | |
223 | { | |
224 | checkItemReferences(); | |
225 | checkProperties(); | |
226 | } | |
128 | 227 | } |
129 | 228 | |
130 | 229 | public void ignorableWhitespace(char[] chars, int arg1, int arg2) |
134 | 233 | public void processingInstruction(String arg0, String arg1) |
135 | 234 | { |
136 | 235 | } |
137 | ||
236 | ||
237 | private void checkItemReferences() { | |
238 | ||
239 | if(this.resourceRefs.isEmpty()) { | |
240 | return; | |
241 | } | |
242 | ||
243 | } | |
244 | ||
245 | private void checkFragment(String ref) { | |
246 | ||
247 | String frag = PathUtil.getFragment(ref.trim()); | |
248 | ||
249 | if (ref.indexOf("#") == -1 || Strings.isNullOrEmpty(frag)) { | |
250 | // must include a non-empty fragid | |
251 | report.message(MessageId.MED_014, EPUBLocation.create(path, parser.getLineNumber(), parser.getColumnNumber())); | |
252 | } | |
253 | } | |
254 | ||
255 | protected void checkProperties() | |
256 | { | |
257 | if (!context.ocf.isPresent()) // single file validation | |
258 | { | |
259 | return; | |
260 | } | |
261 | ||
262 | Set<ITEM_PROPERTIES> itemProps = Property.filter(context.properties, ITEM_PROPERTIES.class); | |
263 | ||
264 | for (ITEM_PROPERTIES requiredProperty : Sets.difference(requiredProperties, itemProps)) | |
265 | { | |
266 | report.message(MessageId.OPF_014, EPUBLocation.create(path), | |
267 | PackageVocabs.ITEM_VOCAB.getName(requiredProperty)); | |
268 | } | |
269 | } | |
138 | 270 | } |
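The `checkFragment` method above reports MED_014 when an overlay `text` reference lacks a non-empty fragment identifier. A standalone sketch of that rule, using a simplified stand-in for `PathUtil.getFragment` (the stand-in is a hypothetical helper, not EPUBCheck's actual implementation):

```java
// Illustrative sketch: the fragment rule enforced by checkFragment above.
public class FragmentRuleDemo {

    // Simplified stand-in for PathUtil.getFragment (hypothetical helper).
    static String getFragment(String uri) {
        int i = uri.indexOf('#');
        return (i < 0) ? null : uri.substring(i + 1);
    }

    // Mirrors the condition in checkFragment: the reference must contain
    // a '#' followed by a non-empty fragment id, otherwise MED_014 fires.
    static boolean hasValidFragment(String ref) {
        String frag = getFragment(ref.trim());
        return ref.indexOf('#') != -1 && frag != null && !frag.isEmpty();
    }

    public static void main(String[] args) {
        System.out.println(hasValidFragment("chapter1.xhtml#para1")); // true
        System.out.println(hasValidFragment("chapter1.xhtml"));       // false -> MED_014
        System.out.println(hasValidFragment("chapter1.xhtml#"));      // false -> MED_014
    }
}
```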
0 | package com.adobe.epubcheck.overlay; | |
1 | ||
2 | import java.util.Map; | |
3 | import java.util.HashMap; | |
4 | ||
5 | public class OverlayTextChecker { | |
6 | ||
7 | private Map<String,String> refs; | |
8 | ||
9 | public OverlayTextChecker() { | |
10 | refs = new HashMap<String,String>(); | |
11 | } | |
12 | ||
13 | public boolean add(String ref, String overlay) { | |
14 | if (!refs.containsKey(ref)) { | |
15 | refs.put(ref, overlay); | |
16 | return true; | |
17 | } | |
18 | else if (!refs.get(ref).equalsIgnoreCase(overlay)) { | |
19 | return false; | |
20 | } | |
21 | return true; | |
22 | } | |
23 | ||
24 | public boolean isReferencedByOverlay(String path) { | |
25 | if (path == null || path.equals("")) { | |
26 | return false; | |
27 | } | |
28 | return refs.containsKey(path); | |
29 | } | |
30 | ||
31 | public boolean isCorrectOverlay(String path, String overlay) { | |
32 | return overlay.equalsIgnoreCase(refs.get(path)); | |
33 | } | |
34 | } |
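`OverlayTextChecker` enforces that each content document is claimed by at most one media overlay; `OverlayHandler.processRef` reports MED_011 when `add` returns false. A standalone sketch with a trimmed replica of the class (illustrative only; the document path and overlay ids are made up):

```java
// Illustrative sketch (not part of EPUBCheck): the "one overlay per
// content document" rule behind message MED_011.
public class OverlayTextCheckerDemo {

    // Trimmed replica of the OverlayTextChecker class defined above.
    static final class Checker {
        private final java.util.Map<String, String> refs = new java.util.HashMap<>();

        boolean add(String ref, String overlay) {
            if (!refs.containsKey(ref)) {
                refs.put(ref, overlay);
                return true;
            }
            // A second, different overlay claiming the same document is rejected.
            return refs.get(ref).equalsIgnoreCase(overlay);
        }
    }

    public static void main(String[] args) {
        Checker checker = new Checker();
        System.out.println(checker.add("OEBPS/c1.xhtml", "mo-1")); // true: first claim
        System.out.println(checker.add("OEBPS/c1.xhtml", "mo-1")); // true: same overlay
        System.out.println(checker.add("OEBPS/c1.xhtml", "mo-2")); // false: MED_011 case
    }
}
```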
0 | /* | |
1 | * This class was originally created by the DAISY Consortium | |
2 | * for another project, licensed under LGPL v2.1. | |
3 | * It is now integrated in EPUBCheck and relicensed under | |
4 | * EPUBCheck’s primary license. | |
5 | * See https://github.com/w3c/epubcheck/pull/1173 | |
6 | */ | |
7 | package com.adobe.epubcheck.overlay; | |
8 | ||
9 | import java.math.BigDecimal; | |
10 | import java.text.DecimalFormat; | |
11 | import java.text.DecimalFormatSymbols; | |
12 | import java.text.NumberFormat; | |
13 | import java.util.regex.Matcher; | |
14 | import java.util.regex.Pattern; | |
15 | ||
16 | /** | |
17 | * A <code>SmilClock</code> object is a wrapper for a SMIL clock value (time) | |
18 | * | |
19 | * <pre> | |
20 | * Versions: | |
21 | * 0.1.0 (09/02/2003) | |
22 | * - Implemented string parsing | |
23 | * - Implemented both toString() methods | |
24 | * 0.1.1 (10/02/2003) | |
25 | * - Added static method to get/set tolerance for equals() and compareTo() methods | |
26 | * - Modified equals() and compareTo() to take tolerance value into account | |
27 | * 0.2.0 (10/04/2003) | |
28 | * - Added support for npt= formats | |
29 | * - Fixed bug in SmilClock(double) constructor | |
30 | * - Fixed nasty bug in SmilClock(String) constructor | |
31 | * 1.0.1 (11/01/2004) | |
32 | * - Fixed bug in milliseconds parsing in SmilClock(String s); now handles values with more/less than 3 digits | |
33 | * - Fixed bug in toString(int format) that caused milliseconds to lose leading zeroes | |
34 | * 1.0.2 (11/06/2005) Markus | |
35 | * - Added optimization: patterns compiled and static | |
36 | * 1.0.3 (21/06/2005) Markus | |
37 | * - Added secondsValueRounded | |
38 | * 1.0.4 (10/02/2006) Linus | |
39 | * - Fixed locale bug in toString: now using DecimalFormat instead of NumberFormat | |
40 | * 1.0.5 (20/06/2006) Laurie | |
41 | * - Added HUMAN_READABLE static int toString(int) | |
42 | * 1.1.0 (14/11/2006) Linus | |
43 | * - Use BigDecimal instead of double to avoid rounding errors | |
44 | * </pre> | |
45 | * | |
46 | * @author James Pritchett | |
47 | */ | |
48 | public class SmilClock { | |
49 | // TODO move this to a more appropriate package | |
50 | private static Pattern fullClockPattern = Pattern | |
51 | .compile("(npt=)?(\\d+):([0-5]\\d):([0-5]\\d)([.](\\d+))?"); | |
52 | private static Pattern partialClockPattern = Pattern | |
53 | .compile("(npt=)?([0-5]\\d):([0-5]\\d)([.](\\d+))?"); | |
54 | private static Pattern timecountClockPattern = Pattern | |
55 | .compile("(npt=)?(\\d+([.]\\d+)?)(h|min|s|ms)?"); | |
56 | ||
57 | /** | |
58 | * @param s A string representation of the SMIL clock value in any accepted | |
59 | * format | |
60 | * @throws NumberFormatException if the string is not a legal SMIL clock | |
61 | * value format | |
62 | */ | |
63 | public SmilClock(String s) throws NumberFormatException { | |
64 | Matcher m; | |
65 | BigDecimal bd; | |
66 | ||
67 | /* | |
68 | * This uses regular expressions to parse the given string. It tries | |
69 | * each of the three formats (full, partial, timecount) and throws an | |
70 | * exception if none of them match. It uses regular expression groupings | |
71 | * to capture the various numeric portions of the string at parse-time, | |
72 | * which it then uses to calculate the milliseconds value. | |
73 | */ | |
74 | ||
75 | // test for timecount clock value | |
76 | m = timecountClockPattern.matcher(s.trim()); | |
77 | if (m.matches()) { | |
78 | bd = new BigDecimal(m.group(2)); // Save the number (with | |
79 | // fraction) | |
80 | if (m.group(4) == null) { | |
81 | // this.msecValue = (long)(bd.longValue() * 1000); | |
82 | // //(28/11/2006)Piotr: this one truncates fraction | |
83 | this.msecValue = new ClipTime(bd.multiply(BigDecimal.valueOf((long) 1000)) | |
84 | .longValue()); | |
85 | } else if (m.group(4).equals("ms")) { | |
86 | this.msecValue = new ClipTime(bd.doubleValue()); // NOTE: This will NOT truncate fraction | |
87 | } else if (m.group(4).equals("s")) { | |
88 | // this.msecValue = bd.multiply(new | |
89 | // BigDecimal((long)1000)).longValue(); //(28/11/2006)Piotr: the | |
90 | // constructor BigDecimal(long l) missing in java 1.4; ZedVal | |
91 | // feature | |
92 | this.msecValue = new ClipTime(bd.multiply(BigDecimal.valueOf((long) 1000)) | |
93 | .longValue()); | |
94 | } else if (m.group(4).equals("min")) { | |
95 | // this.msecValue = bd.multiply(new | |
96 | // BigDecimal((long)60000)).longValue(); //(28/11/2006)Piotr: as | |
97 | // above | |
98 | this.msecValue = new ClipTime(bd.multiply(BigDecimal.valueOf((long) (60*1000))).doubleValue()); | |
99 | } else if (m.group(4).equals("h")) { | |
100 | // this.msecValue = bd.multiply(new | |
101 | // BigDecimal((long)3600000)).longValue(); //(28/11/2006)Piotr: | |
102 | // as above | |
103 | this.msecValue = new ClipTime(bd.multiply(BigDecimal.valueOf((long) 60*60*1000)).longValue()); | |
104 | } else { | |
105 | this.msecValue = new ClipTime(); | |
106 | } | |
107 | return; | |
108 | } | |
109 | ||
110 | // test for a full clock value | |
111 | m = fullClockPattern.matcher(s.trim()); | |
112 | if (m.matches()) { | |
113 | this.msecValue = new ClipTime((Long.parseLong(m.group(2)) * 60*60*1000) | |
114 | + (Long.parseLong(m.group(3)) * 60* 1000) | |
115 | + (Long.parseLong(m.group(4)) * 1000) | |
116 | + ((m.group(6) != null) ? new BigDecimal(m.group(5)).multiply(BigDecimal.valueOf(1000)).doubleValue() : 0)); | |
117 | return; | |
118 | } | |
119 | ||
120 | // test for partial clock value | |
121 | m = partialClockPattern.matcher(s.trim()); | |
122 | if (m.matches()) { | |
123 | this.msecValue = new ClipTime((Long.parseLong(m.group(2)) * 60*1000) | |
124 | + (Long.parseLong(m.group(3)) * 1000) | |
125 | + ((m.group(5) != null) ? new BigDecimal(m.group(4)).multiply(BigDecimal.valueOf(1000)).doubleValue() : 0)); | |
126 | return; | |
127 | } | |
128 | ||
129 | // If we got this far, s is not a legal SMIL clock value | |
130 | throw new NumberFormatException("Invalid SMIL clock value format: " | |
131 | + s.trim()); | |
132 | } | |
133 | ||
134 | public SmilClock() { | |
135 | this.msecValue = new ClipTime(); | |
136 | } | |
137 | ||
138 | /** | |
139 | * @param msec Time value in milliseconds | |
140 | */ | |
141 | // public SmilClock(long msec) { | |
142 | // this.msecValue = new ClipTime(msec); | |
143 | // } | |
144 | ||
145 | private SmilClock(ClipTime clipTime) { | |
146 | this.msecValue = clipTime; | |
147 | } | |
148 | ||
149 | // public SmilClock(SmilClock toCopy) { | |
150 | // this.msecValue = toCopy.getTimeWOPrecisionLoss(); | |
151 | // } | |
152 | // | |
153 | /** | |
154 | * @param sec Time value in seconds | |
155 | */ | |
156 | public SmilClock(double sec) { | |
157 | this.msecValue = new ClipTime(sec * 1000); | |
158 | } | |
159 | ||
160 | public SmilClock addTime(SmilClock addTime) { | |
161 | return new SmilClock(this.getTimeWOPrecisionLoss().add(addTime.getTimeWOPrecisionLoss())); | |
162 | } | |
163 | ||
164 | public SmilClock subtractTime(SmilClock subtractTime) { | |
165 | return new SmilClock(this.getTimeWOPrecisionLoss().subtract(subtractTime.getTimeWOPrecisionLoss())); | |
166 | } | |
167 | ||
168 | /** | |
169 | * | |
170 | * Just for compatibility, broken by design really | |
171 | * | |
172 | * The SmilClock should only be initialized by values of "seconds", another basic type, | |
173 | * implying another unit type; milliseconds is way too dangerous! | |
174 | * @param msec Time value in milliseconds | |
175 | */ | |
176 | @Deprecated | |
177 | public SmilClock(long msec) { | |
178 | this.msecValue = new ClipTime(msec); | |
179 | } | |
180 | ||
181 | //public void setToMiliseconds(double msec) { | |
182 | // this.msecValue = new ClipTime(msec); | |
183 | //} | |
184 | ||
185 | public boolean notSet() { | |
186 | return this.msecValue.notSet(); | |
187 | } | |
188 | ||
189 | ||
190 | /** | |
191 | * Returns clock value in full clock value format (default) | |
192 | * | |
193 | * @return String in full clock value format (HH:MM:SS.mmm) | |
194 | */ | |
195 | @Override | |
196 | public String toString() { | |
197 | return this.toString(SmilClock.FULL); | |
198 | } | |
199 | ||
200 | /** | |
201 | * Returns clock value in specified format | |
202 | * | |
203 | * @param format Format code (FULL, PARTIAL, TIMECOUNT) | |
204 | * @return String with value in named format | |
205 | */ | |
206 | public String toString(int format) { | |
207 | long hr; | |
208 | long min; | |
209 | long sec; | |
210 | double msec; | |
211 | long tmp; | |
212 | ||
213 | String s; | |
214 | ||
215 | NumberFormat nfInt = NumberFormat.getIntegerInstance(); | |
216 | nfInt.setMinimumIntegerDigits(2); | |
217 | NumberFormat nfMsec = NumberFormat.getIntegerInstance(); | |
218 | nfMsec.setMinimumIntegerDigits(3); | |
219 | DecimalFormatSymbols dfSymbols = new DecimalFormatSymbols(); | |
220 | dfSymbols.setDecimalSeparator('.'); | |
221 | DecimalFormat dfDouble = new DecimalFormat("0.000", dfSymbols); | |
222 | dfDouble.setMaximumFractionDigits(3); | |
223 | dfDouble.setGroupingUsed(false); | |
224 | ||
225 | // Break out all the pieces ... | |
226 | msec = this.msecValue.roundedToMilliSeconds().getTimeInMs() % 1000; | |
227 | tmp = (Math.round(this.msecValue.getTimeInMs() - msec)) / 1000; | |
228 | sec = tmp % 60; | |
229 | tmp = (tmp - sec) / 60; | |
230 | min = tmp % 60; | |
231 | hr = (tmp - min) / 60; | |
232 | ||
233 | switch (format) { | |
234 | case FULL: | |
235 | if (msec > 0) { | |
236 | s = hr + ":" + nfInt.format(min) + ":" + nfInt.format(sec) | |
237 | + "." + nfMsec.format(msec); | |
238 | } else { | |
239 | s = hr + ":" + nfInt.format(min) + ":" + nfInt.format(sec); | |
240 | } | |
241 | break; | |
242 | case PARTIAL: | |
243 | // TODO : Comment probably wrong! (Comment older than "previous" code..?? | |
244 | // KNOWN BUG: This will return misleading results for clock values > | |
245 | // 59:59.999 | |
246 | // WORK AROUND: Caller is responsible for testing that this is an | |
247 | // appropriate format | |
248 | if (msec > 0) { | |
249 | s = nfInt.format(min) + ":" + nfInt.format(sec) + "." | |
250 | + nfMsec.format(msec); | |
251 | } else { | |
252 | s = nfInt.format(min) + ":" + nfInt.format(sec); | |
253 | } | |
254 | break; | |
255 | case TIMECOUNT: | |
256 | s = dfDouble.format(BigDecimal.valueOf(this.msecValue.getTimeInMs() / 1000)); | |
257 | break; | |
258 | case TIMECOUNT_MSEC: | |
259 | s = dfDouble.format(BigDecimal.valueOf(this.msecValue.getTimeInMs())) + "ms"; | |
260 | break; | |
261 | case RAW_TIMECOUNT_TRUNCATED_MSC: | |
262 | s = Long.toString((long) Math.ceil(this.msecValue.getTimeInMs())); | |
263 | break; | |
264 | case TIMECOUNT_SEC: | |
265 | s = dfDouble.format(BigDecimal.valueOf(this.msecValue.getTimeInMs() / 1000)) + "s"; | |
266 | break; | |
267 | case TIMECOUNT_MIN: | |
268 | s = dfDouble.format(BigDecimal.valueOf(this.msecValue.getTimeInMs() / (1000*60))) + "min"; | |
269 | break; | |
270 | case TIMECOUNT_HR: | |
271 | s = dfDouble.format(BigDecimal.valueOf(this.msecValue.getTimeInMs() / (1000*60*60))) + "h"; | |
272 | break; | |
273 | case HUMAN_READABLE: | |
274 | if (hr > 0) { | |
275 | s = hr + " h " + nfInt.format(min) + " min "; | |
276 | } else if (min > 0) { | |
277 | s = nfInt.format(min) + " min " + nfInt.format(sec) + " s"; | |
278 | } else if (sec > 0) { | |
279 | s = nfInt.format(sec) + " s " + nfMsec.format(msec) + " ms"; | |
280 | } else { | |
281 | s = nfMsec.format(msec) + " ms"; | |
282 | } | |
283 | break; | |
284 | default: | |
285 | throw new NumberFormatException("Unknown SMIL clock format code: " | |
286 | + format); | |
287 | } | |
288 | return s; | |
289 | } | |
290 | ||
291 | /** | |
292 | * Returns the underlying clock value in milliseconds, without precision loss | |
293 | */ | |
294 | private ClipTime getTimeWOPrecisionLoss() { | |
295 | return this.msecValue; | |
296 | } | |
297 | ||
298 | /** | |
299 | * | |
300 | * Just for compatibility, broken by design really | |
301 | * | |
302 | * @return clock value in milliseconds | |
303 | */ | |
304 | @Deprecated | |
305 | public long millisecondsValue() { | |
306 | return millisecondsValueAsLong(); | |
307 | } | |
308 | ||
309 | public long millisecondsValueAsLong() { | |
310 | return Math.round(this.msecValue.roundedToMilliSeconds().getTimeInMs()); | |
311 | } | |
312 | ||
313 | /** | |
314 | * | |
315 | * Just for compatibility, broken by design really | |
316 | * | |
317 | * Enhance the type system even further: get rid of long/double altogether and use some class "Seconds" instead! | |
318 | * @return clock value rounded to full seconds | |
319 | */ | |
320 | @Deprecated | |
321 | public long secondsValueRounded() { | |
322 | return Math.round(this.secondsValue()); | |
323 | } | |
324 | ||
325 | /** | |
326 | * Returns clock value in seconds | |
327 | * | |
328 | * @return clock value in seconds | |
329 | */ | |
330 | public double secondsValue() { | |
331 | // return new | |
332 | // BigDecimal(this.msecValue).divide(BigDecimal.valueOf(1000)).doubleValue(); | |
333 | // //(28/11/2006)PK: BigDecimal#divide(BigDecimal bd) not in java 1.4; | |
334 | // ZedVal feature | |
335 | return (double) this.msecValue.getTimeInMs() / 1000; | |
336 | } | |
337 | ||
338 | /** | |
339 | * Returns clock value in seconds, rounded to full seconds | |
340 | * | |
341 | * @return clock value in seconds, rounded to full seconds | |
342 | */ | |
343 | // public long secondsValueRounded() { | |
344 | // return Math.round(this.secondsValue()); | |
345 | // } | |
346 | ||
347 | public double secondsValueRoundedDouble() { | |
348 | return Math.round(this.secondsValue()); | |
349 | } | |
350 | ||
351 | public SmilClock roundToMSPrecision() { | |
352 | return new SmilClock(this.getTimeWOPrecisionLoss().roundedToMilliSeconds()); | |
353 | } | |
354 | ||
355 | public SmilClock floorToMSPrecision() { | |
356 | return new SmilClock(this.getTimeWOPrecisionLoss().floorToMilliSeconds()); | |
357 | } | |
358 | ||
359 | // FIXME Hashcode not implemented, should come in pair with "equals" | |
360 | ||
361 | // implement equals() so we can test values for equality | |
362 | @Override | |
363 | public boolean equals(Object otherObject) { | |
364 | if (this == otherObject) | |
365 | return true; // Objects are identical | |
366 | if (otherObject == null) | |
367 | return false; // There ain't nuthin' like a null ... | |
368 | if (getClass() != otherObject.getClass()) | |
369 | return false; // No class-mixing, either | |
370 | try { | |
371 | SmilClock other = (SmilClock) otherObject; // Cast it, then | |
372 | // compare, using | |
373 | // tolerance | |
374 | return eqWithinTolerance(other, msecTolerance); | |
375 | } catch (ClassCastException cce) { | |
376 | // do nothing | |
377 | } | |
378 | return false; | |
379 | } | |
380 | ||
381 | public boolean eqWithinTolerance(SmilClock other, long msecTolerance) { | |
382 | return compareTo(other, msecTolerance) == 0; | |
383 | } | |
388 | ||
389 | // implement Comparable interface so we can sort and compare values | |
390 | public int compareTo(Object otherObject) throws ClassCastException { | |
391 | return compareTo(otherObject, getTolerance()); | |
392 | } | |
393 | ||
394 | public int compareTo(Object otherObject, long msecTolerance) throws ClassCastException { | |
395 | SmilClock other = (SmilClock) otherObject; // Hope for the best! | |
396 | if (Math.abs(other.msecValue.getTimeInMs() - this.msecValue.getTimeInMs()) <= msecTolerance) { | |
397 | return 0; | |
398 | } | |
399 | if (this.msecValue.getTimeInMs() < other.msecValue.getTimeInMs()) { | |
400 | return -1; | |
401 | } | |
402 | return 1; | |
403 | } | |
404 | ||
405 | // Static methods | |
406 | ||
407 | /** | |
408 | * Sets tolerance for comparisons and equality testing. | |
409 | * <p> | |
410 | * When comparing two values, if they differ by less than the given | |
411 | * tolerance, they will be evaluated as equal to one another. | |
412 | * </p> | |
413 | * | |
414 | * @param msec Tolerance value in milliseconds | |
415 | */ | |
416 | public static void setTolerance(long msec) { | |
417 | msecTolerance = msec; | |
418 | } | |
419 | ||
420 | /** | |
421 | * Returns tolerance setting | |
422 | * | |
423 | * @return Current tolerance value in milliseconds | |
424 | */ | |
425 | public static long getTolerance() { | |
426 | return msecTolerance; | |
427 | } | |
428 | ||
429 | // Type codes for the different SMIL clock value formats | |
430 | public static final int FULL = 1; | |
431 | public static final int PARTIAL = 2; | |
432 | public static final int TIMECOUNT = 3; // Default version (no metric) | |
433 | public static final int TIMECOUNT_MSEC = 4; | |
434 | public static final int TIMECOUNT_SEC = 5; | |
435 | public static final int TIMECOUNT_MIN = 6; | |
436 | public static final int TIMECOUNT_HR = 7; | |
437 | public static final int HUMAN_READABLE = 8; | |
438 | public static final int RAW_TIMECOUNT_TRUNCATED_MSC = 9; | |
439 | ||
440 | private final ClipTime msecValue; // All values stored in milliseconds | |
441 | private static long msecTolerance; | |
442 | } |
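The `SmilClock(String)` constructor accepts three clock formats, trying them in timecount-first order: timecount (`90.5s`), full (`H:MM:SS.mmm`), and partial (`MM:SS.mmm`). A standalone sketch using the same three patterns as the class above (the `classify` helper is illustrative, not part of the codebase):

```java
import java.util.regex.Pattern;

// Illustrative sketch: the three SMIL clock formats accepted by the
// SmilClock(String) constructor above, tested with the same patterns
// and in the same order as the constructor.
public class SmilClockFormats {
    static final Pattern FULL =
        Pattern.compile("(npt=)?(\\d+):([0-5]\\d):([0-5]\\d)([.](\\d+))?");
    static final Pattern PARTIAL =
        Pattern.compile("(npt=)?([0-5]\\d):([0-5]\\d)([.](\\d+))?");
    static final Pattern TIMECOUNT =
        Pattern.compile("(npt=)?(\\d+([.]\\d+)?)(h|min|s|ms)?");

    static String classify(String s) {
        if (TIMECOUNT.matcher(s.trim()).matches()) return "timecount";
        if (FULL.matcher(s.trim()).matches()) return "full";
        if (PARTIAL.matcher(s.trim()).matches()) return "partial";
        return "invalid"; // SmilClock would throw NumberFormatException here
    }

    public static void main(String[] args) {
        System.out.println(classify("0:01:30.5")); // full
        System.out.println(classify("01:30.5"));   // partial
        System.out.println(classify("90.5s"));     // timecount
        System.out.println(classify("1:60:00"));   // invalid (60 is not a valid minute)
    }
}
```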
32 | 32 | private int nWarning = 0; |
33 | 33 | @JsonProperty |
34 | 34 | private int nUsage = 0; |
35 | ||
36 | private final String workingDirectory = System.getProperty("user.dir"); | |
37 | 35 | |
38 | 36 | public void setFileInfo(File epubFile) |
39 | 37 | { |
42 | 42 | { |
43 | 43 | |
44 | 44 | private static final String BUNDLE_NAME = "com.adobe.epubcheck.util.messages"; |
45 | private static final Table<String, String, Messages> messageTable = HashBasedTable.create(); | |
45 | private static final Table<String, Locale, Messages> messageTable = HashBasedTable.create(); | |
46 | 46 | |
47 | 47 | private ResourceBundle bundle; |
48 | 48 | private Locale locale; |
85 | 85 | locale = (locale == null) ? Locale.getDefault() : locale; |
86 | 86 | |
87 | 87 | String bundleKey = (cls==null)? BUNDLE_NAME : getBundleName(cls); |
88 | String localeKey = locale.getLanguage(); | |
89 | if (messageTable.contains(bundleKey, localeKey)) { | |
90 | instance = messageTable.get(bundleKey, localeKey); | |
88 | if (messageTable.contains(bundleKey, locale)) { | |
89 | instance = messageTable.get(bundleKey, locale); | |
91 | 90 | } |
92 | 91 | else |
93 | 92 | { |
96 | 95 | if (instance == null) |
97 | 96 | { |
98 | 97 | instance = new Messages(locale, bundleKey); |
99 | messageTable.put(bundleKey, localeKey, instance); | |
98 | messageTable.put(bundleKey, locale, instance); | |
100 | 99 | } |
101 | 100 | } |
102 | 101 | } |
38 | 38 | // This class should probably be entirely refactored at some point |
39 | 39 | public class PathUtil |
40 | 40 | { |
41 | static final String workingDirectory = System.getProperty("user.dir"); | |
42 | 41 | |
43 | 42 | private static final Pattern REGEX_URI_SCHEME = Pattern |
44 | 43 | .compile("^\\p{Alpha}(\\p{Alnum}|\\.|\\+|-)*:"); |
150 | 149 | { |
151 | 150 | return path; |
152 | 151 | } |
153 | return path.replace(workingDirectory, "."); | |
152 | String workingDirectory = System.getProperty("user.dir"); | |
153 | if ("/".equals(workingDirectory) || !path.startsWith(workingDirectory)) { | |
154 | return path; | |
155 | } | |
156 | return ".".concat(path.substring(workingDirectory.length())); | |
154 | 157 | } |
155 | 158 | |
156 | 159 | public static String getFragment(String uri) |
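The PathUtil hunk replaces a blanket `path.replace(workingDirectory, ".")` with a prefix check plus `substring`. That matters because `String.replace` rewrites every occurrence of the working directory anywhere in the path (and degenerates badly when the working directory is `/`). A sketch of the patched behavior, with a hypothetical helper name mirroring the logic shown in the diff:

```java
public class RelativizeSketch {
    // Only strip the working-directory *prefix*, and skip the degenerate "/"
    // case entirely, as the patched PathUtil does.
    static String removeWorkingDirectory(String path, String workingDirectory) {
        if (path == null || "/".equals(workingDirectory)
                || !path.startsWith(workingDirectory)) {
            return path;
        }
        return ".".concat(path.substring(workingDirectory.length()));
    }

    public static void main(String[] args) {
        String wd = "/home/user";
        // A leading match is relativized...
        if (!removeWorkingDirectory("/home/user/book.epub", wd).equals("./book.epub"))
            throw new AssertionError("prefix not stripped");
        // ...but an interior occurrence is left alone; the old replace-based
        // code would have rewritten it too.
        if (!removeWorkingDirectory("/tmp/home/user/book.epub", wd)
                .equals("/tmp/home/user/book.epub"))
            throw new AssertionError("interior match was mangled");
        System.out.println("ok");
    }
}
```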
0 | 0 | Copyright (c) {{YEAR}}, {{OWNER}} |
1 | 1 | All rights reserved. |
2 | 2 | |
3 | Redistribution and use in source and binary forms, with or without | |
4 | modification, are permitted provided that the following conditions are met: | |
3 | Redistribution and use in source and binary forms, with or without modification, | |
4 | are permitted provided that the following conditions are met: | |
5 | 5 | |
6 | Redistributions of source code must retain the above copyright notice, this | |
7 | list of conditions and the following disclaimer. | |
6 | Redistributions of source code must retain the above copyright notice, this list | |
7 | of conditions and the following disclaimer. | |
8 | 8 | |
9 | 9 | Redistributions in binary form must reproduce the above copyright notice, this |
10 | 10 | list of conditions and the following disclaimer in the documentation and/or |
17 | 17 | THIS SOFTWARE IS PROVIDED BY {{THE COPYRIGHT HOLDERS AND CONTRIBUTORS}} "AS IS" |
18 | 18 | AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE |
19 | 19 | IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE |
20 | DISCLAIMED. IN NO EVENT SHALL {{THE COPYRIGHT HOLDER OR CONTRIBUTORS}} BE | |
21 | LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR | |
22 | CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE | |
23 | GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) | |
24 | HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT | |
25 | LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT | |
26 | OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.⏎ | |
20 | DISCLAIMED. IN NO EVENT SHALL {{THE COPYRIGHT HOLDER OR CONTRIBUTORS}} BE LIABLE | |
21 | FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL | |
22 | DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR | |
23 | SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER | |
24 | CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR | |
25 | TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF | |
26 | THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.⏎ |
0 | Copyright <YEAR> <COPYRIGHT HOLDER> | |
1 | ||
2 | Permission is hereby granted, free of charge, to any person obtaining a copy of | |
3 | this software and associated documentation files (the "Software"), to deal in | |
4 | the Software without restriction, including without limitation the rights to | |
5 | use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of | |
6 | the Software, and to permit persons to whom the Software is furnished to do so, | |
7 | subject to the following conditions: | |
8 | ||
9 | The above copyright notice and this permission notice shall be included in all | |
10 | copies or substantial portions of the Software. | |
11 | ||
12 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR | |
13 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS | |
14 | FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR | |
15 | COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER | |
16 | IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN | |
17 | CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. | |
18 |
0 | MOZILLA PUBLIC LICENSE | |
1 | Version 1.0 | |
2 | ||
3 | ---------------- | |
4 | ||
5 | 1. Definitions. | |
6 | ||
7 | 1.1. ``Contributor'' means each entity that creates or contributes to | |
8 | the creation of Modifications. | |
9 | ||
10 | 1.2. ``Contributor Version'' means the combination of the Original | |
11 | Code, prior Modifications used by a Contributor, and the Modifications | |
12 | made by that particular Contributor. | |
13 | ||
14 | 1.3. ``Covered Code'' means the Original Code or Modifications or the | |
15 | combination of the Original Code and Modifications, in each case | |
16 | including portions thereof. | |
17 | ||
18 | 1.4. ``Electronic Distribution Mechanism'' means a mechanism generally | |
19 | accepted in the software development community for the electronic | |
20 | transfer of data. | |
21 | ||
22 | 1.5. ``Executable'' means Covered Code in any form other than Source | |
23 | Code. | |
24 | ||
25 | 1.6. ``Initial Developer'' means the individual or entity identified as | |
26 | the Initial Developer in the Source Code notice required by Exhibit A. | |
27 | ||
28 | 1.7. ``Larger Work'' means a work which combines Covered Code or | |
29 | portions thereof with code not governed by the terms of this License. | |
30 | ||
31 | 1.8. ``License'' means this document. | |
32 | ||
33 | 1.9. ``Modifications'' means any addition to or deletion from the | |
34 | substance or structure of either the Original Code or any previous | |
35 | Modifications. When Covered Code is released as a series of files, a | |
36 | Modification is: | |
37 | ||
38 | A. Any addition to or deletion from the contents of a file | |
39 | containing Original Code or previous Modifications. | |
40 | ||
41 | B. Any new file that contains any part of the Original Code or | |
42 | previous Modifications. | |
43 | ||
44 | 1.10. ``Original Code'' means Source Code of computer software code | |
45 | which is described in the Source Code notice required by Exhibit A as | |
46 | Original Code, and which, at the time of its release under this License | |
47 | is not already Covered Code governed by this License. | |
48 | ||
49 | 1.11. ``Source Code'' means the preferred form of the Covered Code for | |
50 | making modifications to it, including all modules it contains, plus any | |
51 | associated interface definition files, scripts used to control | |
52 | compilation and installation of an Executable, or a list of source code | |
53 | differential comparisons against either the Original Code or another | |
54 | well known, available Covered Code of the Contributor's choice. The | |
55 | Source Code can be in a compressed or archival form, provided the | |
56 | appropriate decompression or de-archiving software is widely available | |
57 | for no charge. | |
58 | ||
59 | 1.12. ``You'' means an individual or a legal entity exercising rights | |
60 | under, and complying with all of the terms of, this License or a future | |
61 | version of this License issued under Section 6.1. For legal entities, | |
62 | ``You'' includes any entity which controls, is controlled by, or is | |
63 | under common control with You. For purposes of this definition, | |
64 | ``control'' means (a) the power, direct or indirect, to cause the | |
65 | direction or management of such entity, whether by contract or | |
66 | otherwise, or (b) ownership of fifty percent (50%) or more of the | |
67 | outstanding shares or beneficial ownership of such entity. | |
68 | ||
69 | 2. Source Code License. | |
70 | ||
71 | 2.1. The Initial Developer Grant. | |
72 | The Initial Developer hereby grants You a world-wide, royalty-free, | |
73 | non-exclusive license, subject to third party intellectual property | |
74 | claims: | |
75 | ||
76 | (a) to use, reproduce, modify, display, perform, sublicense and | |
77 | distribute the Original Code (or portions thereof) with or without | |
78 | Modifications, or as part of a Larger Work; and | |
79 | ||
80 | (b) under patents now or hereafter owned or controlled by Initial | |
81 | Developer, to make, have made, use and sell (``Utilize'') the | |
82 | Original Code (or portions thereof), but solely to the extent that | |
83 | any such patent is reasonably necessary to enable You to Utilize | |
84 | the Original Code (or portions thereof) and not to any greater | |
85 | extent that may be necessary to Utilize further Modifications or | |
86 | combinations. | |
87 | ||
88 | 2.2. Contributor Grant. | |
89 | Each Contributor hereby grants You a world-wide, royalty-free, | |
90 | non-exclusive license, subject to third party intellectual property | |
91 | claims: | |
92 | ||
93 | (a) to use, reproduce, modify, display, perform, sublicense and | |
94 | distribute the Modifications created by such Contributor (or | |
95 | portions thereof) either on an unmodified basis, with other | |
96 | Modifications, as Covered Code or as part of a Larger Work; and | |
97 | ||
98 | (b) under patents now or hereafter owned or controlled by | |
99 | Contributor, to Utilize the Contributor Version (or portions | |
100 | thereof), but solely to the extent that any such patent is | |
101 | reasonably necessary to enable You to Utilize the Contributor | |
102 | Version (or portions thereof), and not to any greater extent that | |
103 | may be necessary to Utilize further Modifications or combinations. | |
104 | ||
105 | 3. Distribution Obligations. | |
106 | ||
107 | 3.1. Application of License. | |
108 | The Modifications which You create or to which You contribute are | |
109 | governed by the terms of this License, including without limitation | |
110 | Section 2.2. The Source Code version of Covered Code may be distributed | |
111 | only under the terms of this License or a future version of this | |
112 | License released under Section 6.1, and You must include a copy of this | |
113 | License with every copy of the Source Code You distribute. You may not | |
114 | offer or impose any terms on any Source Code version that alters or | |
115 | restricts the applicable version of this License or the recipients' | |
116 | rights hereunder. However, You may include an additional document | |
117 | offering the additional rights described in Section 3.5. | |
118 | ||
119 | 3.2. Availability of Source Code. | |
120 | Any Modification which You create or to which You contribute must be | |
121 | made available in Source Code form under the terms of this License | |
122 | either on the same media as an Executable version or via an accepted | |
123 | Electronic Distribution Mechanism to anyone to whom you made an | |
124 | Executable version available; and if made available via Electronic | |
125 | Distribution Mechanism, must remain available for at least twelve (12) | |
126 | months after the date it initially became available, or at least six | |
127 | (6) months after a subsequent version of that particular Modification | |
128 | has been made available to such recipients. You are responsible for | |
129 | ensuring that the Source Code version remains available even if the | |
130 | Electronic Distribution Mechanism is maintained by a third party. | |
131 | ||
132 | 3.3. Description of Modifications. | |
133 | You must cause all Covered Code to which you contribute to contain a | |
134 | file documenting the changes You made to create that Covered Code and | |
135 | the date of any change. You must include a prominent statement that the | |
136 | Modification is derived, directly or indirectly, from Original Code | |
137 | provided by the Initial Developer and including the name of the Initial | |
138 | Developer in (a) the Source Code, and (b) in any notice in an | |
139 | Executable version or related documentation in which You describe the | |
140 | origin or ownership of the Covered Code. | |
141 | ||
142 | 3.4. Intellectual Property Matters | |
143 | ||
144 | (a) Third Party Claims. | |
145 | If You have knowledge that a party claims an intellectual property | |
146 | right in particular functionality or code (or its utilization | |
147 | under this License), you must include a text file with the source | |
148 | code distribution titled ``LEGAL'' which describes the claim and | |
149 | the party making the claim in sufficient detail that a recipient | |
150 | will know whom to contact. If you obtain such knowledge after You | |
151 | make Your Modification available as described in Section 3.2, You | |
152 | shall promptly modify the LEGAL file in all copies You make | |
153 | available thereafter and shall take other steps (such as notifying | |
154 | appropriate mailing lists or newsgroups) reasonably calculated to | |
155 | inform those who received the Covered Code that new knowledge has | |
156 | been obtained. | |
157 | ||
158 | (b) Contributor APIs. | |
159 | If Your Modification is an application programming interface and | |
160 | You own or control patents which are reasonably necessary to | |
161 | implement that API, you must also include this information in the | |
162 | LEGAL file. | |
163 | ||
164 | 3.5. Required Notices. | |
165 | You must duplicate the notice in Exhibit A in each file of the Source | |
166 | Code, and this License in any documentation for the Source Code, where | |
167 | You describe recipients' rights relating to Covered Code. If You | |
168 | created one or more Modification(s), You may add your name as a | |
169 | Contributor to the notice described in Exhibit A. If it is not possible | |
170 | to put such notice in a particular Source Code file due to its | |
171 | structure, then you must include such notice in a location (such as a | |
172 | relevant directory file) where a user would be likely to look for such | |
173 | a notice. You may choose to offer, and to charge a fee for, warranty, | |
174 | support, indemnity or liability obligations to one or more recipients | |
175 | of Covered Code. However, You may do so only on Your own behalf, and | |
176 | not on behalf of the Initial Developer or any Contributor. You must | |
177 | make it absolutely clear than any such warranty, support, indemnity or | |
178 | liability obligation is offered by You alone, and You hereby agree to | |
179 | indemnify the Initial Developer and every Contributor for any liability | |
180 | incurred by the Initial Developer or such Contributor as a result of | |
181 | warranty, support, indemnity or liability terms You offer. | |
182 | ||
183 | 3.6. Distribution of Executable Versions. | |
184 | You may distribute Covered Code in Executable form only if the | |
185 | requirements of Section 3.1-3.5 have been met for that Covered Code, | |
186 | and if You include a notice stating that the Source Code version of the | |
187 | Covered Code is available under the terms of this License, including a | |
188 | description of how and where You have fulfilled the obligations of | |
189 | Section 3.2. The notice must be conspicuously included in any notice in | |
190 | an Executable version, related documentation or collateral in which You | |
191 | describe recipients' rights relating to the Covered Code. You may | |
192 | distribute the Executable version of Covered Code under a license of | |
193 | Your choice, which may contain terms different from this License, | |
194 | provided that You are in compliance with the terms of this License and | |
195 | that the license for the Executable version does not attempt to limit | |
196 | or alter the recipient's rights in the Source Code version from the | |
197 | rights set forth in this License. If You distribute the Executable | |
198 | version under a different license You must make it absolutely clear | |
199 | that any terms which differ from this License are offered by You alone, | |
200 | not by the Initial Developer or any Contributor. You hereby agree to | |
201 | indemnify the Initial Developer and every Contributor for any liability | |
202 | incurred by the Initial Developer or such Contributor as a result of | |
203 | any such terms You offer. | |
204 | ||
205 | 3.7. Larger Works. | |
206 | You may create a Larger Work by combining Covered Code with other code | |
207 | not governed by the terms of this License and distribute the Larger | |
208 | Work as a single product. In such a case, You must make sure the | |
209 | requirements of this License are fulfilled for the Covered Code. | |
210 | ||
211 | 4. Inability to Comply Due to Statute or Regulation. | |
212 | ||
213 | If it is impossible for You to comply with any of the terms of this | |
214 | License with respect to some or all of the Covered Code due to statute | |
215 | or regulation then You must: (a) comply with the terms of this License | |
216 | to the maximum extent possible; and (b) describe the limitations and | |
217 | the code they affect. Such description must be included in the LEGAL | |
218 | file described in Section 3.4 and must be included with all | |
219 | distributions of the Source Code. Except to the extent prohibited by | |
220 | statute or regulation, such description must be sufficiently detailed | |
221 | for a recipient of ordinary skill to be able to understand it. | |
222 | ||
223 | 5. Application of this License. | |
224 | ||
225 | This License applies to code to which the Initial Developer has | |
226 | attached the notice in Exhibit A, and to related Covered Code. | |
227 | ||
228 | 6. Versions of the License. | |
229 | ||
230 | 6.1. New Versions. | |
231 | Netscape Communications Corporation (``Netscape'') may publish revised | |
232 | and/or new versions of the License from time to time. Each version will | |
233 | be given a distinguishing version number. | |
234 | ||
235 | 6.2. Effect of New Versions. | |
236 | Once Covered Code has been published under a particular version of the | |
237 | License, You may always continue to use it under the terms of that | |
238 | version. You may also choose to use such Covered Code under the terms | |
239 | of any subsequent version of the License published by Netscape. No one | |
240 | other than Netscape has the right to modify the terms applicable to | |
241 | Covered Code created under this License. | |
242 | ||
243 | 6.3. Derivative Works. | |
244 | If you create or use a modified version of this License (which you may | |
245 | only do in order to apply it to code which is not already Covered Code | |
246 | governed by this License), you must (a) rename Your license so that the | |
247 | phrases ``Mozilla'', ``MOZILLAPL'', ``MOZPL'', ``Netscape'', ``NPL'' or | |
248 | any confusingly similar phrase do not appear anywhere in your license | |
249 | and (b) otherwise make it clear that your version of the license | |
250 | contains terms which differ from the Mozilla Public License and | |
251 | Netscape Public License. (Filling in the name of the Initial Developer, | |
252 | Original Code or Contributor in the notice described in Exhibit A shall | |
253 | not of themselves be deemed to be modifications of this License.) | |
254 | ||
255 | 7. DISCLAIMER OF WARRANTY. | |
256 | ||
257 | COVERED CODE IS PROVIDED UNDER THIS LICENSE ON AN ``AS IS'' BASIS, | |
258 | WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, | |
259 | WITHOUT LIMITATION, WARRANTIES THAT THE COVERED CODE IS FREE OF | |
260 | DEFECTS, MERCHANTABLE, FIT FOR A PARTICULAR PURPOSE OR NON-INFRINGING. | |
261 | THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE COVERED CODE | |
262 | IS WITH YOU. SHOULD ANY COVERED CODE PROVE DEFECTIVE IN ANY RESPECT, | |
263 | YOU (NOT THE INITIAL DEVELOPER OR ANY OTHER CONTRIBUTOR) ASSUME THE | |
264 | COST OF ANY NECESSARY SERVICING, REPAIR OR CORRECTION. THIS DISCLAIMER | |
265 | OF WARRANTY CONSTITUTES AN ESSENTIAL PART OF THIS LICENSE. NO USE OF | |
266 | ANY COVERED CODE IS AUTHORIZED HEREUNDER EXCEPT UNDER THIS DISCLAIMER. | |
267 | ||
268 | 8. TERMINATION. | |
269 | ||
270 | This License and the rights granted hereunder will terminate | |
271 | automatically if You fail to comply with terms herein and fail to cure | |
272 | such breach within 30 days of becoming aware of the breach. All | |
273 | sublicenses to the Covered Code which are properly granted shall | |
274 | survive any termination of this License. Provisions which, by their | |
275 | nature, must remain in effect beyond the termination of this License | |
276 | shall survive. | |
277 | ||
278 | 9. LIMITATION OF LIABILITY. | |
279 | ||
280 | UNDER NO CIRCUMSTANCES AND UNDER NO LEGAL THEORY, WHETHER TORT | |
281 | (INCLUDING NEGLIGENCE), CONTRACT, OR OTHERWISE, SHALL THE INITIAL | |
282 | DEVELOPER, ANY OTHER CONTRIBUTOR, OR ANY DISTRIBUTOR OF COVERED CODE, | |
283 | OR ANY SUPPLIER OF ANY OF SUCH PARTIES, BE LIABLE TO YOU OR ANY OTHER | |
284 | PERSON FOR ANY INDIRECT, SPECIAL, INCIDENTAL, OR CONSEQUENTIAL DAMAGES | |
285 | OF ANY CHARACTER INCLUDING, WITHOUT LIMITATION, DAMAGES FOR LOSS OF | |
286 | GOODWILL, WORK STOPPAGE, COMPUTER FAILURE OR MALFUNCTION, OR ANY AND | |
287 | ALL OTHER COMMERCIAL DAMAGES OR LOSSES, EVEN IF SUCH PARTY SHALL HAVE | |
288 | BEEN INFORMED OF THE POSSIBILITY OF SUCH DAMAGES. THIS LIMITATION OF | |
289 | LIABILITY SHALL NOT APPLY TO LIABILITY FOR DEATH OR PERSONAL INJURY | |
290 | RESULTING FROM SUCH PARTY'S NEGLIGENCE TO THE EXTENT APPLICABLE LAW | |
291 | PROHIBITS SUCH LIMITATION. SOME JURISDICTIONS DO NOT ALLOW THE | |
292 | EXCLUSION OR LIMITATION OF INCIDENTAL OR CONSEQUENTIAL DAMAGES, SO THAT | |
293 | EXCLUSION AND LIMITATION MAY NOT APPLY TO YOU. | |
294 | ||
295 | 10. U.S. GOVERNMENT END USERS. | |
296 | ||
297 | The Covered Code is a ``commercial item,'' as that term is defined in | |
298 | 48 C.F.R. 2.101 (Oct. 1995), consisting of ``commercial computer | |
299 | software'' and ``commercial computer software documentation,'' as such | |
300 | terms are used in 48 C.F.R. 12.212 (Sept. 1995). Consistent with 48 | |
301 | C.F.R. 12.212 and 48 C.F.R. 227.7202-1 through 227.7202-4 (June 1995), | |
302 | all U.S. Government End Users acquire Covered Code with only those | |
303 | rights set forth herein. | |
304 | ||
305 | 11. MISCELLANEOUS. | |
306 | ||
307 | This License represents the complete agreement concerning subject | |
308 | matter hereof. If any provision of this License is held to be | |
309 | unenforceable, such provision shall be reformed only to the extent | |
310 | necessary to make it enforceable. This License shall be governed by | |
311 | California law provisions (except to the extent applicable law, if any, | |
312 | provides otherwise), excluding its conflict-of-law provisions. With | |
313 | respect to disputes in which at least one party is a citizen of, or an | |
314 | entity chartered or registered to do business in, the United States of | |
315 | America: (a) unless otherwise agreed in writing, all disputes relating | |
316 | to this License (excepting any dispute relating to intellectual | |
317 | property rights) shall be subject to final and binding arbitration, | |
318 | with the losing party paying all costs of arbitration; (b) any | |
319 | arbitration relating to this Agreement shall be held in Santa Clara | |
320 | County, California, under the auspices of JAMS/EndDispute; and (c) any | |
321 | litigation relating to this Agreement shall be subject to the | |
322 | jurisdiction of the Federal Courts of the Northern District of | |
323 | California, with venue lying in Santa Clara County, California, with | |
324 | the losing party responsible for costs, including without limitation, | |
325 | court costs and reasonable attorneys fees and expenses. The application | |
326 | of the United Nations Convention on Contracts for the International | |
327 | Sale of Goods is expressly excluded. Any law or regulation which | |
328 | provides that the language of a contract shall be construed against the | |
329 | drafter shall not apply to this License. | |
330 | ||
331 | 12. RESPONSIBILITY FOR CLAIMS. | |
332 | ||
333 | Except in cases where another Contributor has failed to comply with | |
334 | Section 3.4, You are responsible for damages arising, directly or | |
335 | indirectly, out of Your utilization of rights under this License, based | |
336 | on the number of copies of Covered Code you made available, the | |
337 | revenues you received from utilizing such rights, and other relevant | |
338 | factors. You agree to work with affected parties to distribute | |
339 | responsibility on an equitable basis. | |
340 | ||
341 | EXHIBIT A. | |
342 | ||
343 | ``The contents of this file are subject to the Mozilla Public License | |
344 | Version 1.0 (the "License"); you may not use this file except in | |
345 | compliance with the License. You may obtain a copy of the License at | |
346 | http://www.mozilla.org/MPL/ | |
347 | ||
348 | Software distributed under the License is distributed on an "AS IS" | |
349 | basis, WITHOUT WARRANTY OF ANY KIND, either express or implied. See the | |
350 | License for the specific language governing rights and limitations | |
351 | under the License. | |
352 | ||
353 | The Original Code is ______________________________________. | |
354 | ||
355 | The Initial Developer of the Original Code is ________________________. | |
356 | Portions created by ______________________ are Copyright (C) ______ | |
357 | _______________________. All Rights Reserved. | |
358 | ||
359 | Contributor(s): ______________________________________.''⏎ |
0 | Mozilla Public License Version 2.0 | |
1 | ================================== | |
2 | ||
3 | 1. Definitions | |
4 | -------------- | |
5 | ||
6 | 1.1. "Contributor" | |
7 | means each individual or legal entity that creates, contributes to | |
8 | the creation of, or owns Covered Software. | |
9 | ||
10 | 1.2. "Contributor Version" | |
11 | means the combination of the Contributions of others (if any) used | |
12 | by a Contributor and that particular Contributor's Contribution. | |
13 | ||
14 | 1.3. "Contribution" | |
15 | means Covered Software of a particular Contributor. | |
16 | ||
17 | 1.4. "Covered Software" | |
18 | means Source Code Form to which the initial Contributor has attached | |
19 | the notice in Exhibit A, the Executable Form of such Source Code | |
20 | Form, and Modifications of such Source Code Form, in each case | |
21 | including portions thereof. | |
22 | ||
23 | 1.5. "Incompatible With Secondary Licenses" | |
24 | means | |
25 | ||
26 | (a) that the initial Contributor has attached the notice described | |
27 | in Exhibit B to the Covered Software; or | |
28 | ||
29 | (b) that the Covered Software was made available under the terms of | |
30 | version 1.1 or earlier of the License, but not also under the | |
31 | terms of a Secondary License. | |
32 | ||
33 | 1.6. "Executable Form" | |
34 | means any form of the work other than Source Code Form. | |
35 | ||
36 | 1.7. "Larger Work" | |
37 | means a work that combines Covered Software with other material, in | |
38 | a separate file or files, that is not Covered Software. | |
39 | ||
40 | 1.8. "License" | |
41 | means this document. | |
42 | ||
43 | 1.9. "Licensable" | |
44 | means having the right to grant, to the maximum extent possible, | |
45 | whether at the time of the initial grant or subsequently, any and | |
46 | all of the rights conveyed by this License. | |
47 | ||
48 | 1.10. "Modifications" | |
49 | means any of the following: | |
50 | ||
51 | (a) any file in Source Code Form that results from an addition to, | |
52 | deletion from, or modification of the contents of Covered | |
53 | Software; or | |
54 | ||
55 | (b) any new file in Source Code Form that contains any Covered | |
56 | Software. | |
57 | ||
58 | 1.11. "Patent Claims" of a Contributor | |
59 | means any patent claim(s), including without limitation, method, | |
60 | process, and apparatus claims, in any patent Licensable by such | |
61 | Contributor that would be infringed, but for the grant of the | |
62 | License, by the making, using, selling, offering for sale, having | |
63 | made, import, or transfer of either its Contributions or its | |
64 | Contributor Version. | |
65 | ||
66 | 1.12. "Secondary License" | |
67 | means either the GNU General Public License, Version 2.0, the GNU | |
68 | Lesser General Public License, Version 2.1, the GNU Affero General | |
69 | Public License, Version 3.0, or any later versions of those | |
70 | licenses. | |
71 | ||
72 | 1.13. "Source Code Form" | |
73 | means the form of the work preferred for making modifications. | |
74 | ||
75 | 1.14. "You" (or "Your") | |
76 | means an individual or a legal entity exercising rights under this | |
77 | License. For legal entities, "You" includes any entity that | |
78 | controls, is controlled by, or is under common control with You. For | |
79 | purposes of this definition, "control" means (a) the power, direct | |
80 | or indirect, to cause the direction or management of such entity, | |
81 | whether by contract or otherwise, or (b) ownership of more than | |
82 | fifty percent (50%) of the outstanding shares or beneficial | |
83 | ownership of such entity. | |
84 | ||
85 | 2. License Grants and Conditions | |
86 | -------------------------------- | |
87 | ||
88 | 2.1. Grants | |
89 | ||
90 | Each Contributor hereby grants You a world-wide, royalty-free, | |
91 | non-exclusive license: | |
92 | ||
93 | (a) under intellectual property rights (other than patent or trademark) | |
94 | Licensable by such Contributor to use, reproduce, make available, | |
95 | modify, display, perform, distribute, and otherwise exploit its | |
96 | Contributions, either on an unmodified basis, with Modifications, or | |
97 | as part of a Larger Work; and | |
98 | ||
99 | (b) under Patent Claims of such Contributor to make, use, sell, offer | |
100 | for sale, have made, import, and otherwise transfer either its | |
101 | Contributions or its Contributor Version. | |
102 | ||
103 | 2.2. Effective Date | |
104 | ||
105 | The licenses granted in Section 2.1 with respect to any Contribution | |
106 | become effective for each Contribution on the date the Contributor first | |
107 | distributes such Contribution. | |
108 | ||
109 | 2.3. Limitations on Grant Scope | |
110 | ||
111 | The licenses granted in this Section 2 are the only rights granted under | |
112 | this License. No additional rights or licenses will be implied from the | |
113 | distribution or licensing of Covered Software under this License. | |
114 | Notwithstanding Section 2.1(b) above, no patent license is granted by a | |
115 | Contributor: | |
116 | ||
117 | (a) for any code that a Contributor has removed from Covered Software; | |
118 | or | |
119 | ||
120 | (b) for infringements caused by: (i) Your and any other third party's | |
121 | modifications of Covered Software, or (ii) the combination of its | |
122 | Contributions with other software (except as part of its Contributor | |
123 | Version); or | |
124 | ||
125 | (c) under Patent Claims infringed by Covered Software in the absence of | |
126 | its Contributions. | |
127 | ||
128 | This License does not grant any rights in the trademarks, service marks, | |
129 | or logos of any Contributor (except as may be necessary to comply with | |
130 | the notice requirements in Section 3.4). | |
131 | ||
132 | 2.4. Subsequent Licenses | |
133 | ||
134 | No Contributor makes additional grants as a result of Your choice to | |
135 | distribute the Covered Software under a subsequent version of this | |
136 | License (see Section 10.2) or under the terms of a Secondary License (if | |
137 | permitted under the terms of Section 3.3). | |
138 | ||
139 | 2.5. Representation | |
140 | ||
141 | Each Contributor represents that the Contributor believes its | |
142 | Contributions are its original creation(s) or it has sufficient rights | |
143 | to grant the rights to its Contributions conveyed by this License. | |
144 | ||
145 | 2.6. Fair Use | |
146 | ||
147 | This License is not intended to limit any rights You have under | |
148 | applicable copyright doctrines of fair use, fair dealing, or other | |
149 | equivalents. | |
150 | ||
151 | 2.7. Conditions | |
152 | ||
153 | Sections 3.1, 3.2, 3.3, and 3.4 are conditions of the licenses granted | |
154 | in Section 2.1. | |
155 | ||
156 | 3. Responsibilities | |
157 | ------------------- | |
158 | ||
159 | 3.1. Distribution of Source Form | |
160 | ||
161 | All distribution of Covered Software in Source Code Form, including any | |
162 | Modifications that You create or to which You contribute, must be under | |
163 | the terms of this License. You must inform recipients that the Source | |
164 | Code Form of the Covered Software is governed by the terms of this | |
165 | License, and how they can obtain a copy of this License. You may not | |
166 | attempt to alter or restrict the recipients' rights in the Source Code | |
167 | Form. | |
168 | ||
169 | 3.2. Distribution of Executable Form | |
170 | ||
171 | If You distribute Covered Software in Executable Form then: | |
172 | ||
173 | (a) such Covered Software must also be made available in Source Code | |
174 | Form, as described in Section 3.1, and You must inform recipients of | |
175 | the Executable Form how they can obtain a copy of such Source Code | |
176 | Form by reasonable means in a timely manner, at a charge no more | |
177 | than the cost of distribution to the recipient; and | |
178 | ||
179 | (b) You may distribute such Executable Form under the terms of this | |
180 | License, or sublicense it under different terms, provided that the | |
181 | license for the Executable Form does not attempt to limit or alter | |
182 | the recipients' rights in the Source Code Form under this License. | |
183 | ||
184 | 3.3. Distribution of a Larger Work | |
185 | ||
186 | You may create and distribute a Larger Work under terms of Your choice, | |
187 | provided that You also comply with the requirements of this License for | |
188 | the Covered Software. If the Larger Work is a combination of Covered | |
189 | Software with a work governed by one or more Secondary Licenses, and the | |
190 | Covered Software is not Incompatible With Secondary Licenses, this | |
191 | License permits You to additionally distribute such Covered Software | |
192 | under the terms of such Secondary License(s), so that the recipient of | |
193 | the Larger Work may, at their option, further distribute the Covered | |
194 | Software under the terms of either this License or such Secondary | |
195 | License(s). | |
196 | ||
197 | 3.4. Notices | |
198 | ||
199 | You may not remove or alter the substance of any license notices | |
200 | (including copyright notices, patent notices, disclaimers of warranty, | |
201 | or limitations of liability) contained within the Source Code Form of | |
202 | the Covered Software, except that You may alter any license notices to | |
203 | the extent required to remedy known factual inaccuracies. | |
204 | ||
205 | 3.5. Application of Additional Terms | |
206 | ||
207 | You may choose to offer, and to charge a fee for, warranty, support, | |
208 | indemnity or liability obligations to one or more recipients of Covered | |
209 | Software. However, You may do so only on Your own behalf, and not on | |
210 | behalf of any Contributor. You must make it absolutely clear that any | |
211 | such warranty, support, indemnity, or liability obligation is offered by | |
212 | You alone, and You hereby agree to indemnify every Contributor for any | |
213 | liability incurred by such Contributor as a result of warranty, support, | |
214 | indemnity or liability terms You offer. You may include additional | |
215 | disclaimers of warranty and limitations of liability specific to any | |
216 | jurisdiction. | |
217 | ||
218 | 4. Inability to Comply Due to Statute or Regulation | |
219 | --------------------------------------------------- | |
220 | ||
221 | If it is impossible for You to comply with any of the terms of this | |
222 | License with respect to some or all of the Covered Software due to | |
223 | statute, judicial order, or regulation then You must: (a) comply with | |
224 | the terms of this License to the maximum extent possible; and (b) | |
225 | describe the limitations and the code they affect. Such description must | |
226 | be placed in a text file included with all distributions of the Covered | |
227 | Software under this License. Except to the extent prohibited by statute | |
228 | or regulation, such description must be sufficiently detailed for a | |
229 | recipient of ordinary skill to be able to understand it. | |
230 | ||
231 | 5. Termination | |
232 | -------------- | |
233 | ||
234 | 5.1. The rights granted under this License will terminate automatically | |
235 | if You fail to comply with any of its terms. However, if You become | |
236 | compliant, then the rights granted under this License from a particular | |
237 | Contributor are reinstated (a) provisionally, unless and until such | |
238 | Contributor explicitly and finally terminates Your grants, and (b) on an | |
239 | ongoing basis, if such Contributor fails to notify You of the | |
240 | non-compliance by some reasonable means prior to 60 days after You have | |
241 | come back into compliance. Moreover, Your grants from a particular | |
242 | Contributor are reinstated on an ongoing basis if such Contributor | |
243 | notifies You of the non-compliance by some reasonable means, this is the | |
244 | first time You have received notice of non-compliance with this License | |
245 | from such Contributor, and You become compliant prior to 30 days after | |
246 | Your receipt of the notice. | |
247 | ||
248 | 5.2. If You initiate litigation against any entity by asserting a patent | |
249 | infringement claim (excluding declaratory judgment actions, | |
250 | counter-claims, and cross-claims) alleging that a Contributor Version | |
251 | directly or indirectly infringes any patent, then the rights granted to | |
252 | You by any and all Contributors for the Covered Software under Section | |
253 | 2.1 of this License shall terminate. | |
254 | ||
255 | 5.3. In the event of termination under Sections 5.1 or 5.2 above, all | |
256 | end user license agreements (excluding distributors and resellers) which | |
257 | have been validly granted by You or Your distributors under this License | |
258 | prior to termination shall survive termination. | |
259 | ||
260 | ************************************************************************ | |
261 | * * | |
262 | * 6. Disclaimer of Warranty * | |
263 | * ------------------------- * | |
264 | * * | |
265 | * Covered Software is provided under this License on an "as is" * | |
266 | * basis, without warranty of any kind, either expressed, implied, or * | |
267 | * statutory, including, without limitation, warranties that the * | |
268 | * Covered Software is free of defects, merchantable, fit for a * | |
269 | * particular purpose or non-infringing. The entire risk as to the * | |
270 | * quality and performance of the Covered Software is with You. * | |
271 | * Should any Covered Software prove defective in any respect, You * | |
272 | * (not any Contributor) assume the cost of any necessary servicing, * | |
273 | * repair, or correction. This disclaimer of warranty constitutes an * | |
274 | * essential part of this License. No use of any Covered Software is * | |
275 | * authorized under this License except under this disclaimer. * | |
276 | * * | |
277 | ************************************************************************ | |
278 | ||
279 | ************************************************************************ | |
280 | * * | |
281 | * 7. Limitation of Liability * | |
282 | * -------------------------- * | |
283 | * * | |
284 | * Under no circumstances and under no legal theory, whether tort * | |
285 | * (including negligence), contract, or otherwise, shall any * | |
286 | * Contributor, or anyone who distributes Covered Software as * | |
287 | * permitted above, be liable to You for any direct, indirect, * | |
288 | * special, incidental, or consequential damages of any character * | |
289 | * including, without limitation, damages for lost profits, loss of * | |
290 | * goodwill, work stoppage, computer failure or malfunction, or any * | |
291 | * and all other commercial damages or losses, even if such party * | |
292 | * shall have been informed of the possibility of such damages. This * | |
293 | * limitation of liability shall not apply to liability for death or * | |
294 | * personal injury resulting from such party's negligence to the * | |
295 | * extent applicable law prohibits such limitation. Some * | |
296 | * jurisdictions do not allow the exclusion or limitation of * | |
297 | * incidental or consequential damages, so this exclusion and * | |
298 | * limitation may not apply to You. * | |
299 | * * | |
300 | ************************************************************************ | |
301 | ||
302 | 8. Litigation | |
303 | ------------- | |
304 | ||
305 | Any litigation relating to this License may be brought only in the | |
306 | courts of a jurisdiction where the defendant maintains its principal | |
307 | place of business and such litigation shall be governed by laws of that | |
308 | jurisdiction, without reference to its conflict-of-law provisions. | |
309 | Nothing in this Section shall prevent a party's ability to bring | |
310 | cross-claims or counter-claims. | |
311 | ||
312 | 9. Miscellaneous | |
313 | ---------------- | |
314 | ||
315 | This License represents the complete agreement concerning the subject | |
316 | matter hereof. If any provision of this License is held to be | |
317 | unenforceable, such provision shall be reformed only to the extent | |
318 | necessary to make it enforceable. Any law or regulation which provides | |
319 | that the language of a contract shall be construed against the drafter | |
320 | shall not be used to construe this License against a Contributor. | |
321 | ||
322 | 10. Versions of the License | |
323 | --------------------------- | |
324 | ||
325 | 10.1. New Versions | |
326 | ||
327 | Mozilla Foundation is the license steward. Except as provided in Section | |
328 | 10.3, no one other than the license steward has the right to modify or | |
329 | publish new versions of this License. Each version will be given a | |
330 | distinguishing version number. | |
331 | ||
332 | 10.2. Effect of New Versions | |
333 | ||
334 | You may distribute the Covered Software under the terms of the version | |
335 | of the License under which You originally received the Covered Software, | |
336 | or under the terms of any subsequent version published by the license | |
337 | steward. | |
338 | ||
339 | 10.3. Modified Versions | |
340 | ||
341 | If you create software not governed by this License, and you want to | |
342 | create a new license for such software, you may create and use a | |
343 | modified version of this License if you rename the license and remove | |
344 | any references to the name of the license steward (except to note that | |
345 | such modified license differs from this License). | |
346 | ||
347 | 10.4. Distributing Source Code Form that is Incompatible With Secondary | |
348 | Licenses | |
349 | ||
350 | If You choose to distribute Source Code Form that is Incompatible With | |
351 | Secondary Licenses under the terms of this version of the License, the | |
352 | notice described in Exhibit B of this License must be attached. | |
353 | ||
354 | Exhibit A - Source Code Form License Notice | |
355 | ------------------------------------------- | |
356 | ||
357 | This Source Code Form is subject to the terms of the Mozilla Public | |
358 | License, v. 2.0. If a copy of the MPL was not distributed with this | |
359 | file, You can obtain one at http://mozilla.org/MPL/2.0/. | |
360 | ||
361 | If it is not possible or desirable to put the notice in a particular | |
362 | file, then You may include the notice in a location (such as a LICENSE | |
363 | file in a relevant directory) where a recipient would be likely to look | |
364 | for such a notice. | |
365 | ||
366 | You may add additional accurate notices of copyright ownership. | |
367 | ||
368 | Exhibit B - "Incompatible With Secondary Licenses" Notice | |
369 | --------------------------------------------------------- | |
370 | ||
371 | This Source Code Form is "Incompatible With Secondary Licenses", as | |
372 | defined by the Mozilla Public License, v. 2.0.⏎ |
0 | <#-- To render the third-party file. | |
1 | Available context : | |
2 | ||
3 | - dependencyMap a collection of Map.Entry with | |
4 | key are dependencies (as a MavenProject) (from the maven project) | |
5 | values are licenses of each dependency (array of string) | |
6 | ||
7 | - licenseMap a collection of Map.Entry with | |
8 | key are licenses of each dependency (array of string) | |
9 | values are all dependencies using this license | |
10 | --> | |
11 | <#function licenseFormat licenses> | |
12 | <#assign result = " "/> | |
13 | <#list licenses as license> | |
14 | <#assign result = result + license/> | |
15 | </#list> | |
16 | <#return result> | |
17 | </#function> | |
18 | <#function artifactFormat p> | |
19 | <#return " " + p.name + ", v" + p.version + "\n"+ " by "+ p.organization.name +" (" + (p.url!"no url defined") + ")"> | |
20 | </#function> | |
21 | Licenses of third-party dependencies | |
22 | ------------------------------------ | |
23 | ||
24 | <#list dependencyMap as e> | |
25 | <#assign project = e.getKey()/> | |
26 | <#assign licenses = e.getValue()/> | |
27 | ${project.name}, ${project.version} | |
28 | <#-- by ${project.organization.name} (${project.url}) --> | |
29 | <#list licenses as license> | |
30 | ${license} | |
31 | </#list> | |
32 | ||
33 | </#list> | |
34 | ||
35 | Copies of the licenses are provided in the 'licenses' directory.⏎ |
0 | isorelax--isorelax--20030108=The 3-clause BSD License⏎ |
0 | 0 | W3C® SOFTWARE NOTICE AND LICENSE |
1 | 1 | |
2 | Copyright © 1994-2002 World Wide Web Consortium, (Massachusetts Institute of Technology, Institut National de Recherche en Informatique et en Automatique, Keio University). All Rights Reserved. http://www.w3.org/Consortium/Legal/ | |
2 | Copyright © 1994-2002 World Wide Web Consortium, (Massachusetts Institute of | |
3 | Technology, Institut National de Recherche en Informatique et en Automatique, | |
4 | Keio University). All Rights Reserved. http://www.w3.org/Consortium/Legal/ | |
3 | 5 | |
4 | This W3C work (including software, documents, or other related items) is being provided by the copyright holders under the following license. By obtaining, using and/or copying this work, you (the licensee) agree that you have read, understood, and will comply with the following terms and conditions: | |
6 | This W3C work (including software, documents, or other related items) is being | |
7 | provided by the copyright holders under the following license. By obtaining, | |
8 | using and/or copying this work, you (the licensee) agree that you have read, | |
9 | understood, and will comply with the following terms and conditions: | |
5 | 10 | |
6 | Permission to use, copy, modify, and distribute this software and its documentation, with or without modification, for any purpose and without fee or royalty is hereby granted, provided that you include the following on ALL copies of the software and documentation or portions thereof, including modifications, that you make: | |
11 | Permission to use, copy, modify, and distribute this software and its | |
12 | documentation, with or without modification, for any purpose and without fee or | |
13 | royalty is hereby granted, provided that you include the following on ALL copies | |
14 | of the software and documentation or portions thereof, including modifications, | |
15 | that you make: | |
7 | 16 | |
8 | The full text of this NOTICE in a location viewable to users of the redistributed or derivative work. | |
9 | Any pre-existing intellectual property disclaimers, notices, or terms and conditions. If none exist, a short notice of the following form (hypertext is preferred, text is permitted) should be used within the body of any redistributed or derivative code: "Copyright © [$date-of-software] World Wide Web Consortium, (Massachusetts Institute of Technology, Institut National de Recherche en Informatique et en Automatique, Keio University). All Rights Reserved. http://www.w3.org/Consortium/Legal/" | |
10 | Notice of any changes or modifications to the W3C files, including the date changes were made. (We recommend you provide URIs to the location from which the code is derived.) | |
11 | THIS SOFTWARE AND DOCUMENTATION IS PROVIDED "AS IS," AND COPYRIGHT HOLDERS MAKE NO REPRESENTATIONS OR WARRANTIES, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO, WARRANTIES OF MERCHANTABILITY OR FITNESS FOR ANY PARTICULAR PURPOSE OR THAT THE USE OF THE SOFTWARE OR DOCUMENTATION WILL NOT INFRINGE ANY THIRD PARTY PATENTS, COPYRIGHTS, TRADEMARKS OR OTHER RIGHTS. | |
17 | The full text of this NOTICE in a location viewable to users of the | |
18 | redistributed or derivative work. Any pre-existing intellectual property | |
19 | disclaimers, notices, or terms and conditions. If none exist, a short notice of | |
20 | the following form (hypertext is preferred, text is permitted) should be used | |
21 | within the body of any redistributed or derivative code: "Copyright © | |
22 | [$date-of-software] World Wide Web Consortium, (Massachusetts Institute of | |
23 | Technology, Institut National de Recherche en Informatique et en Automatique, | |
24 | Keio University). All Rights Reserved. http://www.w3.org/Consortium/Legal/" | |
25 | Notice of any changes or modifications to the W3C files, including the date | |
26 | changes were made. (We recommend you provide URIs to the location from which the | |
27 | code is derived.) THIS SOFTWARE AND DOCUMENTATION IS PROVIDED "AS IS," AND | |
28 | COPYRIGHT HOLDERS MAKE NO REPRESENTATIONS OR WARRANTIES, EXPRESS OR IMPLIED, | |
29 | INCLUDING BUT NOT LIMITED TO, WARRANTIES OF MERCHANTABILITY OR FITNESS FOR ANY | |
30 | PARTICULAR PURPOSE OR THAT THE USE OF THE SOFTWARE OR DOCUMENTATION WILL NOT | |
31 | INFRINGE ANY THIRD PARTY PATENTS, COPYRIGHTS, TRADEMARKS OR OTHER RIGHTS. | |
12 | 32 | |
13 | COPYRIGHT HOLDERS WILL NOT BE LIABLE FOR ANY DIRECT, INDIRECT, SPECIAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF ANY USE OF THE SOFTWARE OR DOCUMENTATION. | |
33 | COPYRIGHT HOLDERS WILL NOT BE LIABLE FOR ANY DIRECT, INDIRECT, SPECIAL OR | |
34 | CONSEQUENTIAL DAMAGES ARISING OUT OF ANY USE OF THE SOFTWARE OR DOCUMENTATION. | |
14 | 35 | |
15 | The name and trademarks of copyright holders may NOT be used in advertising or publicity pertaining to the software without specific, written prior permission. Title to copyright in this software and any associated documentation will at all times remain with copyright holders.⏎ | |
36 | The name and trademarks of copyright holders may NOT be used in advertising or | |
37 | publicity pertaining to the software without specific, written prior permission. | |
38 | Title to copyright in this software and any associated documentation will at all | |
39 | times remain with copyright holders.⏎ |
0 | <#-- To render the third-party file. | |
1 | Available context : | |
2 | ||
3 | - dependencyMap a collection of Map.Entry with | |
4 | key are dependencies (as a MavenProject) (from the maven project) | |
5 | values are licenses of each dependency (array of string) | |
6 | ||
7 | - licenseMap a collection of Map.Entry with | |
8 | key are licenses of each dependency (array of string) | |
9 | values are all dependencies using this license | |
10 | --> | |
11 | <#function licenseFormat licenses> | |
12 | <#assign result = " "/> | |
13 | <#list licenses as license> | |
14 | <#assign result = result + license/> | |
15 | </#list> | |
16 | <#return result> | |
17 | </#function> | |
18 | <#function artifactFormat p> | |
19 | <#return " " + p.name + ", v" + p.version + "\n"+ " by "+ p.organization.name +" (" + (p.url!"no url defined") + ")"> | |
20 | </#function> | |
21 | Licenses of third-party dependencies | |
22 | ------------------------------------ | |
23 | ||
24 | <#list dependencyMap as e> | |
25 | <#assign project = e.getKey()/> | |
26 | <#assign licenses = e.getValue()/> | |
27 | ${project.name}, ${project.version} | |
28 | <#-- by ${project.organization.name} (${project.url}) --> | |
29 | <#list licenses as license> | |
30 | ${license} | |
31 | </#list> | |
32 | ||
33 | </#list> | |
34 | ||
35 | Copies of the licenses are provided in the 'licenses' directory.⏎ |
130 | 130 | MED_005=Media Overlay audio reference %1$s to non-standard audio type %2$s found. |
131 | 131 | MED_006=Some browsers do not support rendering SVG images which use a filename in the xlink:href property. |
132 | 132 | MED_007=Foreign resources can only be referenced from "source" elements with an explicit "type" attribute; found resource "%1$s" of foreign type "%2$s". |
133 | MED_008=The time specified in the clipBegin attribute must not be after clipEnd. | |
134 | MED_009=The time specified in the clipBegin attribute must not be the same as clipEnd. | |
135 | MED_010=EPUB Content Documents referenced from a Media Overlay must specify the "media-overlay" attribute. | |
136 | MED_011=EPUB Content Document referenced from multiple Media Overlay Documents. | |
137 | MED_012=The "media-overlay" attribute does not match the ID of the Media Overlay that refers to this document. | |
138 | MED_013=Media Overlay Document referenced from the "media-overlay" attribute does not contain a reference to this Content Document. | |
139 | MED_014=A non-empty fragment identifier is required. | |
140 | MED_015=Media overlay text references must be in reading order. Text target "%1$s" is before the previous link target in %2$s order. | |
133 | 141 | |
134 | 142 | #NAV EPUB v3 Table of contents |
135 | 143 | NAV_001=The nav file is not supported for EPUB v2. |
0 | 0 | # This is the default MessageBundle.properties file |
1 | 1 | |
2 | 2 | #Info |
3 | INF_001=The previous rule is under review and its severity may change in a future release. See the discussion at %1$s | |
3 | INF_001=Den forrige regel er under gennemgang, og dens alvorlighedsgrad kan ændre sig i en fremtidig release. Se diskussionen på %1$s | |
4 | 4 | |
5 | 5 | #Accessibility |
6 | 6 | ACC_001=Et "img"- eller "area"-HTML-element mangler en "alt"-attribut. |
126 | 126 | MED_001=Video-stillbilledet ("poster") skal være i et OPF-understøttet filformat til billeder. |
127 | 127 | MED_002=Ingen fallback er defineret for elementet %1$s. |
128 | 128 | MED_003=En fallback i manifestet er påkrævet, når billed-resursen "%1$s" er filtype: "%2$s". |
129 | MED_004=Billedfilens overskrift kan være beskadiget. | |
129 | MED_004=Billedfilens 'header' kan være beskadiget. | |
130 | 130 | MED_005="Media Overlay" fandt en lydreference %1$s til en ikke-understøttet lydtype %2$s. |
131 | 131 | MED_006=Nogle browsers understøtter ikke gengivelse af SVG-grafik, når der er filnavn i attributten "xlink:href". |
132 | 132 | MED_007=Der kan kun henvises til eksterne resurser fra "source"-elementer, hvis de har en bestemt "type"-attribut; fandt denne resurse "%1$s" som er ekstern filtype "%2$s". |
133 | MED_008=Tidsangivelsen i "clipBegin"-attributten må ikke være senere end den angivet i "clipEnd". | |
134 | MED_009=Tidsangivelsen i "clipBegin" må ikke være identisk med den angivet i "clipEnd". | |
135 | MED_010=EPUB-indholdsdokumenter, der henvises til fra et Media Overlay, skal angive "media-overlay"-attributten. | |
136 | MED_011=Der henvises til dette EPUB-indholdsdokument fra flere Media Overlay-dokumenter. | |
137 | MED_012=Attributten "media-overlay" matcher ikke id'et for det Media Overlay, der henviser til dette dokument. | |
138 | MED_013=Media Overlay-dokumentet, som der henvises til fra "media-overlay"-attributten, indeholder ikke en reference til dette indholdsdokument. | |
139 | MED_014=Fragment-id'et må ikke være tomt, men skal udfyldes. | |
140 | MED_015=Tekst-referencer i Media Overlays skal komme i korrekt læserækkefølge. Tekst-target "%1$s" kommer før det foregående link-target i rækkefølgen %2$s. | |
133 | 141 | |
134 | 142 | #NAV EPUB v3 Table of contents |
135 | 143 | NAV_001=EPUB 3 navigationsfilen understøttes ikke af EPUB 2. |
325 | 333 | RSC_020="%1$s" er ikke en valid URI. |
326 | 334 | RSC_021=Der blev fundet en EPUB-ordbog med resursen "%1$s", hvilket hverken er et indeks eller et XHTML-indholdsdokument. |
327 | 335 | RSC_022=Billedoplysninger kan ikke kontrolleres (kræver Java 7 eller nyere version). |
328 | RSC_023=Couldn’t parse host of URL "%1$s" (probably due to disallowed characters or missing slashes after the protocol) | |
336 | RSC_023=Kunne ikke analysere værten i URL’en "%1$s" (sandsynligvis på grund af ikke-tilladte tegn eller manglende skråstreger efter protokollen) | |
329 | 337 | |
330 | 338 | #Scripting |
331 | 339 | SCP_001=Brug af JavaScript-funktionen "eval()" i EPUB-filer udgør en sikkerhedsrisiko. |
130 | 130 | MED_005=Im MediaOverlay wurde eine Audio-Referenz "%1$s" zu einem nicht unterstützten Audio-Format "%2$s" gefunden. |
131 | 131 | MED_006=Hinweis: Einige Browser unterstützen die Darstellung von SVG-Grafiken nicht, die einen Dateinamen im "xlink:href"-Attribut verwenden. |
132 | 132 | MED_007=Externe Ressourcen können nur über ein "source"-Element mit explizitem "type"-Attribut referenziert werden. Ressource "%1$s" vom Typ "%2$s" gefunden. |
133 | MED_008=Die im Attribut "clipBegin" angegebene Zeit darf nicht nach "clipEnd" liegen. | |
134 | MED_009=Die im Attribut "clipBegin" angegebene Zeit darf nicht mit "clipEnd" identisch sein. | |
135 | MED_010=EPUB-Inhaltsdokumente, die von einem Media Overlay referenziert werden, müssen das Attribut "media-overlay" angeben. | |
136 | MED_011=EPUB-Inhaltsdokument wird von mehreren Media-Overlay-Dokumenten referenziert. | |
137 | MED_012=Das Attribut "media-overlay" stimmt nicht mit der ID des Media Overlays überein, das auf dieses Dokument verweist. | |
138 | MED_013=Das im Attribut "media-overlay" referenzierte Media-Overlay-Dokument enthält keine Referenz auf dieses Inhaltsdokument. | |
139 | MED_014=Ein nicht-leerer Fragment-Bezeichner ist erforderlich. | |
140 | MED_015=Media-Overlay-Textreferenzen müssen in Lesereihenfolge stehen. Das Textziel "%1$s" liegt vor dem vorherigen Linkziel in der %2$s-Reihenfolge. | |
133 | 141 | |
134 | 142 | #NAV EPUB v3 Table of contents |
135 | 143 | NAV_001=Die EPUB 3-Navigationsdatei wird von EPUB 2 nicht unterstützt. |
130 | 130 | MED_005=Se ha encontrado un recurso de audio Media Overlay %1$s que hace referencia a un tipo de audio no estándar %2$s. |
131 | 131 | MED_006=Algunos navegadores no soportan imágenes SVG que utilizan nombres de archivo en la propiedad xlink:href. |
132 | 132 | MED_007=Los atributos ajenos solo pueden referenciarse desde elementos "source" con un atributo explícito "type". Se ha encontrado el recurso "%1$s" de tipo "%2$s". |
133 | MED_008=El tiempo especificado en el atributo clipBegin no debe ser posterior a clipEnd. | |
134 | MED_009=El tiempo especificado en el atributo clipBegin no debe ser igual a clipEnd. | |
135 | MED_010=Los documentos de contenido EPUB referenciados desde un Media Overlay deben especificar el atributo "media-overlay". | |
136 | MED_011=Documento de contenido EPUB referenciado desde varios documentos Media Overlay. | |
137 | MED_012=El atributo "media-overlay" no coincide con el ID del Media Overlay que hace referencia a este documento. | |
138 | MED_013=El documento Media Overlay referenciado desde el atributo "media-overlay" no contiene una referencia a este documento de contenido. | |
139 | MED_014=Se requiere un identificador de fragmento no vacío. | |
140 | MED_015=Las referencias de texto de Media Overlay deben estar en orden de lectura. El destino de texto "%1$s" está antes del destino del enlace anterior en el orden %2$s. | |
133 | 141 | |
134 | 142 | #NAV EPUB v3 Table of contents |
135 | 143 | NAV_001=El archivo nav no está soportado en EPUB v2. |
130 | 130 | MED_005=La référence audio Media Overlay %1$s d’un type audio non-standard %2$s a été trouvée. |
131 | 131 | MED_006=Certains navigateurs ne supportent pas les images rendues en SVG qui utilisent un nom de fichier dans la propriété xlink:href. |
132 | 132 | MED_007=Les ressources étrangères ne peuvent être référencées qu’à partir d’éléments "source" avec un attribut "type" explicite ; ressource trouvée "%1$s" de type étranger "%2$s". |
133 | MED_008=Le temps indiqué dans clipBegin ne doit pas être après clipEnd. | |
134 | MED_009=Le temps indiqué dans clipBegin ne doit pas être le même que dans clipEnd. | |
135 | MED_010=Les EPUB Content Documents référencés à partir d'un Media Overlay doivent spécifier l'attribut "media-overlay". | |
136 | MED_011=EPUB Content Document référencé à partir de plusieurs Media Overlay Documents. | |
137 | MED_012=L'attribut "media-overlay" ne correspond pas à l'ID du Media Overlay qui se réfère à ce document. | |
138 | MED_013=Le Media Overlay Document référencé à partir de l'attribut "media-overlay" ne contient pas de référence à ce Content Document. | |
139 | MED_014=Un identifiant de fragment non vide est requis. | |
140 | MED_015=Les références de texte Media Overlay doivent être dans l'ordre de lecture. La cible de texte "%1$s" est avant la cible de lien précédente dans l'ordre %2$s. | |
133 | 141 | |
134 | 142 | #NAV EPUB v3 Table of contents |
135 | 143 | NAV_001=Le fichier nav n’est pas supporté pour les EPUB v2. |
130 | 130 | MED_005=È stato trovato un file Media Overlay contenente il riferimento "%1$s" a un file audio di tipo non standard %2$s. |
131 | 131 | MED_006=Alcuni browser non visualizzano immagini SVG che abbiano un nome di file come valore della proprietà "xlink:href". |
132 | 132 | MED_007=Le risorse esterne possono essere referenziate solo da elementi "source" con un attributo "type" esplicito; trovata risorsa "%1$s" di tipo "%2$s". |
133 | MED_008=Il tempo specificato nell'attributo clipBegin non deve essere successivo a clipEnd. | |
134 | MED_009=Il tempo specificato nell'attributo clipBegin non deve essere lo stesso di clipEnd. | |
135 | MED_010=Gli EPUB Content Documents a cui si fa riferimento da un Media Overlay devono specificare l'attributo "media-overlay". | |
136 | MED_011=EPUB Content Document referenziato da più documenti Media Overlay. | |
137 | MED_012=L'attributo "media-overlay" non corrisponde all'ID del Media Overlay che fa riferimento a questo documento. | |
138 | MED_013=Il documento Media Overlay a cui fa riferimento l'attributo "media-overlay" non contiene un riferimento a questo Content Document. | |
139 | MED_014=È richiesto un identificatore di frammento non vuoto. | |
140 | MED_015=I riferimenti testuali dei Media Overlay devono essere nell'ordine di lettura. La destinazione del testo "%1$s" è prima della destinazione del collegamento precedente nell'ordine %2$s. | |
133 | 141 | |
134 | 142 | #NAV EPUB v3 Table of contents |
135 | 143 | NAV_001=EPUB 2 non prevede l'uso di documenti di navigazione "nav" (Navigation Document). |
130 | 130 | MED_005=非標準の音声タイプ %2$s へのメディアオーバーレイの音声の参照 %1$s が見つかりました. |
131 | 131 | MED_006=いくつかのブラウザではxlink:hrefプロパティ内でファイル名を使ったSVG画像の描画をサポートしていません. |
132 | 132 | MED_007=外部リソースは明示的な "type" 属性のある "source" 要素からのみ参照できます; 外部属性 "%2$s" のリソース "%1$s" がありました. |
133 | MED_008=clipBegin 属性で指定された時刻はclipEndの後になることはできません. | |
134 | MED_009=clipBegin属性で指定した時刻はclipEndと同じにしてはいけません. | |
135 | MED_010=メディアオーバーレイから参照されるEPUBコンテンツ文書は "media-overlay" 属性で指定しなければなりません. | |
136 | MED_011=EPUBコンテンツ文書が複数のメディアオーバーレイ文書から参照されています. | |
137 | MED_012="media-overlay"属性がこの文書を参照しているメディアオーバーレイのIDと一致しません. | |
138 | MED_013="media-overlay" 属性から参照されているメディアオーバーレイ文書がこのコンテンツ文書への参照に含まれていません. | |
139 | MED_014=空ではないフラグメント識別子が必要です. | |
140 | MED_015=メディアオーバーレイテキストの参照は閲覧順にしなければなりません. "%1$s" のテキストターゲットが %2$s にあるその前のリンクターゲットの前にあります. | |
133 | 141 | |
134 | 142 | #NAV EPUB v3 Table of contents |
135 | 143 | NAV_001=navファイルはEPUB v2ではサポートされていません. |
130 | 130 | MED_005=미디어 오버레이 오디오를 참조한 %1$s에서 비표준 오디오 유형 %2$s 를 발견했다. |
131 | 131 | MED_006=xlink : href 속성에서 사용하는 파일명의 SVG 이미지는 일부 브라우저에서 렌더링을 지원하지 않습니다. |
132 | 132 | MED_007=외부 자원은 명시적 "type"속성을 가진 "source"요소에서만 참조 할 수 있습니다. 외부 유형 "%2$s"의 리소스 "%1$s"을(를) 발견했습니다. |
133 | MED_008=clipBegin 속성에 지정된 시간은 clipEnd 시간보다 빨라야합니다. | |
134 | MED_009=clipBegin 속성에 지정된 시간은 clipEnd와 같을 수 없습니다. | |
135 | MED_010=미디어 오버레이에서 참조되는 EPUB 콘텐츠 문서는 "media-overlay"속성을 지정해야합니다. | |
136 | MED_011=여러 미디어 오버레이 문서에서 참조된 EPUB 콘텐츠 문서입니다. | |
137 | MED_012="media-overlay"속성이 이 문서를 참조하는 미디어 오버레이의 ID와 일치하지 않습니다. | |
138 | MED_013="media-overlay"속성에서 참조 된 미디어 오버레이 문서에 이 콘텐츠 문서에 대한 참조가 없습니다. | |
139 | MED_014=non-empty fragment identifier가 필요합니다. | |
140 | MED_015=미디어 오버레이 텍스트 참조는 읽기 순서 여야합니다. 대상 텍스트 "%1$s"은 %2$s 의 이전 링크 대상 앞에 있습니다. | |
133 | 141 | |
134 | 142 | #NAV EPUB v3 Table of contents |
135 | 143 | NAV_001="nav" 파일은 ePub v2에서 지원하지 않습니다. |
183 | 191 | OPF_018="remote-resources" 속성이 패키지 문서에서 선언되었지만 원격 리소스에 대한 참조가 발견되지 않았습니다. |
184 | 192 | OPF_018b="remote-resources" 속성은 패키지 문서에 선언되었지만 원격 리소스에 대한 참조가 발견되지 않았습니다. 적합한 속성인지 scripted 콘텐츠를 확인하십시오. |
185 | 193 | OPF_019=OPF 파일에서 "spine" 태그를 찾을 수 없습니다. |
186 | OPF_020=스파인이 너무 많습니다. | |
194 | OPF_020=spine이 너무 많습니다. | |
187 | 195 | OPF_021=href : "%1$s"에 등록되지 않은 URI 스키마 유형을 사용합니다. |
188 | 196 | OPF_025=여러개의 속성 값 "%1$s"은(는) 여기에 사용할 수 없습니다. 하나의 값만 지정할 수 있습니다. |
189 | OPF_026=잘못된 형식의 속성 값 : "%1$s"이 발견되었습니다. | |
197 | OPF_026=잘못된 형식의 속성 값 : "%1$s"이(가) 발견되었습니다. | |
190 | 198 | OPF_027=정의되지 않은 속성 : "%1$s". |
191 | 199 | OPF_028=선언되지 않은 접두어 : "%1$s". |
192 | 200 | OPF_029=파일 "%1$s"은 OPF 파일에 지정된 %2$s 미디어 유형과 일치하지 않습니다. |
194 | 202 | OPF_031=가이드의 참조 요소에 나열된 파일이 OPF 매니페스트 : %1$s에 선언되지 않았습니다. |
195 | 203 | OPF_032=올바른 "OPS 콘텐츠 문서"가 아닌 가이드 참조 "%1$s"입니다. |
196 | 204 | OPF_033=spine에 linear 리소스가 포함되어 있지 않습니다. |
197 | OPF_034=spine에는 ID "%1$s"이 있는 매니페스트 항목에 대한 여러 참조가 포함되어 있습니다. | |
205 | OPF_034=spine에는 ID "%1$s"이(가) 있는 매니페스트 항목에 대한 여러 참조가 포함되어 있습니다. | |
198 | 206 | OPF_035="text/html"미디어 유형은 XHTML/OPS에 적합하지 않습니다. |
199 | 207 | OPF_035_SUG=대신 "application/xhtml+xml"을 사용하세요. 
200 | OPF_036=리딩시스템이 "%1$s"비디오 유형을 지원하지 않을 수 있습니다 | |
208 | OPF_036=리딩시스템이 "%1$s" 비디오 유형을 지원하지 않을 수 있습니다 | |
201 | 209 | OPF_036_SUG=대신 "video/mp4", "video/h264" or "video/webm" 비디오 유형을 사용하세요. |
202 | OPF_037=사용되지 않는 미디어 유형 "%1$s"이 발견되었습니다. | |
203 | OPF_038=미디어 유형 "%1$s"은 OEBPS 1.2 컨텍스트에 적합하지 않습니다. 대신 "text/x-oeb1-document"를 사용하십시오. | |
204 | OPF_039=OEBPS 1.2 컨텍스트에서 미디어 유형 "%1$s"이 적합하지 않습니다. 대신 "text/x-oeb1-css"를 사용하십시오. | |
210 | OPF_037=사용되지 않는 미디어 유형 "%1$s"이(가) 발견되었습니다. | |
211 | OPF_038=미디어 유형 "%1$s"은(는) OEBPS 1.2 컨텍스트에 적합하지 않습니다. 대신 "text/x-oeb1-document"를 사용하십시오. | |
212 | OPF_039=OEBPS 1.2 컨텍스트에서 미디어 유형 "%1$s"이(가) 적합하지 않습니다. 대신 "text/x-oeb1-css"를 사용하십시오. | |
205 | 213 | OPF_040=Fallback item을 찾을 수 없습니다. |
206 | 214 | OPF_041=Fallback-style item을 찾을 수 없습니다. |
207 | 215 | OPF_042="%1$s"은 허용되지 않는 spine media-type입니다. |
214 | 222 | OPF_049=아이템 id "%1$s"를 manifest에서 찾을 수 없습니다. 
215 | 223 | OPF_050=TOC 속성이 non-NCX MIME 유형을 가진 리소스를 참조합니다. "application/x-dtbncx+xml"이 필요합니다. |
216 | 224 | OPF_051=이미지 크기가 권장 크기를 초과합니다. |
217 | OPF_052=Role value "%1$s"이 유효하지 않습니다. | |
218 | OPF_053=Date value "%1$s"은 http://www.w3.org/TR/NOTE-datetime:%2$s의 권장 구문을 따르지 않습니다. | |
219 | OPF_054=Date value "%1$s"은 http://www.w3.org/TR/NOTE-datetime:%2$s의 기준에 적합하지 않습니다. | |
225 | OPF_052=Role value "%1$s"이(가) 유효하지 않습니다. | |
226 | OPF_053=Date value "%1$s"은(는) http://www.w3.org/TR/NOTE-datetime:%2$s의 권장 구문을 따르지 않습니다. | |
227 | OPF_054=Date value "%1$s"은(는) http://www.w3.org/TR/NOTE-datetime:%2$s의 기준에 적합하지 않습니다. | |
220 | 228 | OPF_055=%1$s 태그가 비어있습니다. |
221 | OPF_056=미디어 유형 "%1$s"은 core audio type이 아닙니다. | |
229 | OPF_056=미디어 유형 "%1$s"은(는) core audio type이 아닙니다. | |
222 | 230 | OPF_057=이미지 파일의 길이가 권장 크기를 초과했습니다. |
223 | 231 | OPF_058=Spine 항목 "%1$s"은 탐색 문서의 목차에서 참조되지 않습니다. |
224 | 232 | OPF_058_SUG=manifest의 모든 spine 항목은 Nav Doc에서 하나 이상의 TOC 항목에 의해 참조되어야합니다. |
240 | 248 | OPF_073=외부 식별자는 문서 유형 선언에 나타날 수 없습니다. |
241 | 249 | OPF_074=패키지 리소스 " %1$s "가 menifest 아이템으로 여러번 정의되었습니다. |
242 | 250 | OPF_075=Preview collections은 EPUB 콘텐츠 문서를 가리켜야 합니다. 
243 | OPF_076=preview collections link 요소의 URI에는 EPUB 표준 조각 식별자가 포함되어서는 안됩니다.The URI of preview collections link elements must not include EPUB canonical fragment identifiers. | |
251 | OPF_076=preview collections link 요소의 URI에는 EPUB 표준 조각 식별자가 포함되어서는 안됩니다. The URI of preview collections link elements must not include EPUB canonical fragment identifiers. | |
244 | 252 | OPF_077=Data Navigation Document는 spine에 포함될 수 없습니다. |
245 | 253 | OPF_078=EPUB Dictionary는 dictionary 콘텐츠(epub:type "dictionary")가 포함된 콘텐츠 문서가 하나 이상 포함되어야 합니다. 
246 | 254 | OPF_079=Dictionary content가 발견된 경우 (epub:type "dictionary"), 패키지 문서는 dc:type "dictionary"을 선언해야 합니다. 
130 | 130 | MED_005=De aangetroffen audioreferentie van Media Overlay %1$s naar audio type %2$s is niet-standaard. |
131 | 131 | MED_006=Sommige browsers hebben geen ondersteuning voor het weergeven van SVG beelden die een bestandsnaam in de xlink:href eigenschap gebruiken. |
132 | 132 | MED_007=Foreign resources can only be referenced from "source" elements with an explicit "type" attribute; found resource "%1$s" of foreign type "%2$s". |
133 | MED_008=The time specified in the clipBegin attribute must not be after clipEnd. | |
134 | MED_009=The time specified in the clipBegin attribute must not be the same as clipEnd. | |
135 | MED_010=EPUB Content Documents referenced from a Media Overlay must specify the "media-overlay" attribute. | |
136 | MED_011=EPUB Content Document referenced from multiple Media Overlay Documents. | |
137 | MED_012=The "media-overlay" attribute does not match the ID of the Media Overlay that refers to this document. | |
138 | MED_013=Media Overlay Document referenced from the "media-overlay" attribute does not contain a reference to this Content Document. | |
139 | MED_014=A non-empty fragment identifier is required. | |
140 | MED_015=Media overlay text references must be in reading order. Text target "%1$s" is before the previous link target in %2$s order. | |
133 | 141 | |
134 | 142 | #NAV EPUB v3 Table of contents |
135 | 143 | NAV_001=Het nav bestand wordt in ePub v2 niet ondersteund. |
130 | 130 | MED_005=Referência "%1$s" do áudio de Media Overlay a tipo de áudio não padronizado "%2$s" encontrado. |
131 | 131 | MED_006=Alguns navegadores não suportam renderização de imagens SVG que usem um nome de arquivo na propriedade xlink:href. |
132 | 132 | MED_007=Recursos em formatos não previstos só podem ser referenciados a partir de elementos "source" com um atributo "type" explícito; encontrado o recurso "%1$s" do tipo não previsto "%2$s". |
133 | MED_008=The time specified in the clipBegin attribute must not be after clipEnd. | |
134 | MED_009=The time specified in the clipBegin attribute must not be the same as clipEnd. | |
135 | MED_010=EPUB Content Documents referenced from a Media Overlay must specify the "media-overlay" attribute. | |
136 | MED_011=EPUB Content Document referenced from multiple Media Overlay Documents. | |
137 | MED_012=The "media-overlay" attribute does not match the ID of the Media Overlay that refers to this document. | |
138 | MED_013=Media Overlay Document referenced from the "media-overlay" attribute does not contain a reference to this Content Document. | |
139 | MED_014=A non-empty fragment identifier is required. | |
140 | MED_015=Media overlay text references must be in reading order. Text target "%1$s" is before the previous link target in %2$s order. | |
133 | 141 | |
134 | 142 | #NAV EPUB v3 Table of contents |
135 | 143 | NAV_001=O arquivo de navegação não é suportado para EPUB v2. |
130 | 130 | MED_005=發現語音朗讀使用的音檔 %1$s 是非標準的音訊類型 %2$s。 |
131 | 131 | MED_006=有些瀏覽器不支援在 xlink:href 屬性中使用檔案名稱的 SVG 圖片顯示。 |
132 | 132 | MED_007=外部資源只能透過標註有「type」特性的「source」元件引用。此處有型別為「%2$s」的外部資源「%1$s」,請處理。 |
133 | MED_008=The time specified in the clipBegin attribute must not be after clipEnd. | |
134 | MED_009=The time specified in the clipBegin attribute must not be the same as clipEnd. | |
135 | MED_010=EPUB Content Documents referenced from a Media Overlay must specify the "media-overlay" attribute. | |
136 | MED_011=EPUB Content Document referenced from multiple Media Overlay Documents. | |
137 | MED_012=The "media-overlay" attribute does not match the ID of the Media Overlay that refers to this document. | |
138 | MED_013=Media Overlay Document referenced from the "media-overlay" attribute does not contain a reference to this Content Document. | |
139 | MED_014=A non-empty fragment identifier is required. | |
140 | MED_015=Media overlay text references must be in reading order. Text target "%1$s" is before the previous link target in %2$s order. | |
133 | 141 | |
134 | 142 | #NAV EPUB v3 Table of contents |
135 | 143 | NAV_001=EPUB 2版不支援 nav 檔案。 |
5 | 5 | |
6 | 6 | <include href="./mod/id-unique.sch"/> |
7 | 7 | |
8 | <pattern id="clip-attribute-checks"> | |
9 | <rule context="s:audio[@clipBegin and @clipEnd]"> | |
10 | <!-- #568 check @clipBegin==@clipEnd --> | |
11 | <assert test="@clipBegin != @clipEnd">Attributes "clipBegin" and "clipEnd" must not be equal!</assert> | |
12 | </rule> | |
13 | </pattern> | |
14 | ||
15 | 8 | </schema> |
216 | 216 | ) |
217 | 217 | summary.inner = |
218 | 218 | ( common.inner.phrasing |
219 | | h1.elem | |
220 | | h2.elem | |
221 | | h3.elem | |
222 | | h4.elem | |
223 | | h5.elem | |
224 | | h6.elem | |
225 | | hgroup.elem | |
226 | ) | |
219 | & h1.elem? | |
220 | & h2.elem? | |
221 | & h3.elem? | |
222 | & h4.elem? | |
223 | & h5.elem? | |
224 | & h6.elem? | |
225 | & hgroup.elem? | |
226 | ) |
128 | 128 | & scripting.attr.ondrag? |
129 | 129 | & scripting.attr.ondragend? |
130 | 130 | & scripting.attr.ondragenter? |
131 | & scripting.attr.ondragexit? | |
132 | 131 | & scripting.attr.ondragleave? |
133 | 132 | & scripting.attr.ondragover? |
134 | 133 | & scripting.attr.ondragstart? |
138 | 137 | & scripting.attr.onended? |
139 | 138 | & scripting.attr.onerror? |
140 | 139 | & scripting.attr.onfocus? |
140 | & scripting.attr.onfocusin? | |
141 | & scripting.attr.onfocusout? | |
141 | 142 | & scripting.attr.onformdata? |
142 | 143 | & scripting.attr.oninput? |
143 | 144 | & scripting.attr.oninvalid? |
174 | 175 | & scripting.attr.onsuspend? |
175 | 176 | & scripting.attr.ontimeupdate? |
176 | 177 | & scripting.attr.ontoggle? |
178 | & scripting.attr.ontransitioncancel? | |
179 | & scripting.attr.ontransitionend? | |
180 | & scripting.attr.ontransitionrun? | |
181 | & scripting.attr.ontransitionstart? | |
177 | 182 | & scripting.attr.onvolumechange? |
178 | 183 | & scripting.attr.onwaiting? |
179 | 184 | & scripting.attr.onwheel? |
215 | 220 | attribute ondragend { common.data.functionbody } |
216 | 221 | scripting.attr.ondragenter = |
217 | 222 | attribute ondragenter { common.data.functionbody } |
218 | scripting.attr.ondragexit = | |
219 | attribute ondragexit { common.data.functionbody } | |
220 | 223 | scripting.attr.ondragleave = |
221 | 224 | attribute ondragleave { common.data.functionbody } |
222 | 225 | scripting.attr.ondragover = |
235 | 238 | attribute onerror { common.data.functionbody } |
236 | 239 | scripting.attr.onfocus = |
237 | 240 | attribute onfocus { common.data.functionbody } |
241 | scripting.attr.onfocusin = | |
242 | attribute onfocusin { common.data.functionbody } | |
243 | scripting.attr.onfocusout = | |
244 | attribute onfocusout { common.data.functionbody } | |
238 | 245 | scripting.attr.onformchange = |
239 | 246 | attribute onformchange { common.data.functionbody } |
240 | 247 | scripting.attr.onformdata = |
313 | 320 | attribute ontimeupdate { common.data.functionbody } |
314 | 321 | scripting.attr.ontoggle = |
315 | 322 | attribute ontoggle { common.data.functionbody } |
323 | scripting.attr.ontransitioncancel = | |
324 | attribute ontransitioncancel { common.data.functionbody } | |
325 | scripting.attr.ontransitionend = | |
326 | attribute ontransitionend { common.data.functionbody } | |
327 | scripting.attr.ontransitionrun = | |
328 | attribute ontransitionrun { common.data.functionbody } | |
329 | scripting.attr.ontransitionstart = | |
330 | attribute ontransitionstart { common.data.functionbody } | |
316 | 331 | scripting.attr.onvolumechange = |
317 | 332 | attribute onvolumechange { common.data.functionbody } |
318 | 333 | scripting.attr.onwaiting = |
26 | 26 | & embedded.content.attrs.crossorigin? |
27 | 27 | ) |
28 | 28 | |
29 | no-alt-img.elem = | |
30 | element img { img.inner & shared-img.attrs } | |
31 | ||
32 | 29 | img.elem = |
33 | 30 | element img { img.inner & img.attrs } |
34 | 31 | img.attrs = |
35 | 32 | ( shared-img.attrs |
36 | & img.attrs.alt | |
33 | & img.attrs.alt? | |
37 | 34 | & ( common.attrs.aria.implicit.img |
38 | | common.attrs.aria.implicit.img | |
39 | 35 | | common.attrs.aria.role.button |
40 | 36 | | common.attrs.aria.role.checkbox |
41 | 37 | | common.attrs.aria.role.img |
111 | 107 | img.inner = |
112 | 108 | empty |
113 | 109 | |
114 | common.elem.phrasing |= no-alt-img.elem | |
115 | 110 | common.elem.phrasing |= img.elem |
116 | 111 | |
117 | 112 | ## Image with multiple sources: <picture> |
233 | 228 | | ondrag |
234 | 229 | | ondragend |
235 | 230 | | ondragenter |
236 | | ondragexit | |
237 | 231 | | ondragleave |
238 | 232 | | ondragover |
239 | 233 | | ondragstart |
243 | 237 | | onended |
244 | 238 | | onerror |
245 | 239 | | onfocus |
240 | | onfocusin | |
241 | | onfocusout | |
246 | 242 | | onformdata |
247 | 243 | | oninput |
248 | 244 | | oninvalid |
279 | 275 | | onsuspend |
280 | 276 | | ontimeupdate |
281 | 277 | | ontoggle |
278 | | ontransitioncancel | |
279 | | ontransitionend | |
280 | | ontransitionrun | |
281 | | ontransitionstart | |
282 | 282 | | onvolumechange |
283 | 283 | | onwaiting |
284 | 284 | | onwheel |
478 | 478 | & iframe.attrs.name? |
479 | 479 | & iframe.attrs.width? |
480 | 480 | & iframe.attrs.height? |
481 | & iframe.attrs.loading? | |
481 | 482 | & iframe.attrs.sandbox? |
482 | 483 | & iframe.attrs.allowfullscreen? |
483 | & iframe.attrs.allowpaymentrequest? | |
484 | & iframe.attrs.allowusermedia? | |
485 | 484 | & iframe.attrs.allow? |
486 | 485 | & referrerpolicy? |
487 | 486 | & ( common.attrs.aria.role.application |
506 | 505 | attribute height { |
507 | 506 | common.data.integer.non-negative |
508 | 507 | } |
508 | iframe.attrs.loading = | |
509 | attribute loading { | |
510 | "lazy" | "eager" | |
511 | } | |
509 | 512 | iframe.attrs.width = |
510 | 513 | attribute width { |
511 | 514 | common.data.integer.non-negative |
517 | 520 | iframe.attrs.allowfullscreen = |
518 | 521 | attribute allowfullscreen { |
519 | 522 | "allowfullscreen" | "" |
520 | } & v5only | |
521 | iframe.attrs.allowpaymentrequest = | |
522 | attribute allowpaymentrequest { | |
523 | "allowpaymentrequest" | "" | |
524 | } & v5only | |
525 | iframe.attrs.allowusermedia = | |
526 | attribute allowusermedia { | |
527 | "allowusermedia" | "" | |
528 | 523 | } & v5only |
529 | 524 | iframe.attrs.allow = |
530 | 525 | attribute allow { |
174 | 174 | | string "audioworklet" |
175 | 175 | | string "document" |
176 | 176 | | string "embed" |
177 | | string "fetch" | |
177 | 178 | | string "font" |
178 | 179 | | string "image" |
179 | 180 | | string "manifest" |
209 | 209 | input.file.attrs &= |
210 | 210 | ( shared-input.attrs |
211 | 211 | & input.attrs.multiple? |
212 | & aria.prop.required? | |
212 | 213 | & input.input.attrs.capture? |
213 | 214 | ) |
214 | 215 | input.input.attrs.capture = |
130 | 130 | attribute in2 { text }, |
131 | 131 | [ a:defaultValue = "normal" ] |
132 | 132 | attribute mode { |
133 | string "normal" | string "multiply" | string "screen" | string "darken" | string "lighten" | |
133 | string "normal" | |
134 | | string "multiply" | |
135 | | string "screen" | |
136 | | string "overlay" | |
137 | | string "darken" | |
138 | | string "lighten" | |
139 | | string "color-dodge" | |
140 | | string "color-burn" | |
141 | | string "hard-light" | |
142 | | string "soft-light" | |
143 | | string "difference" | |
144 | | string "exclusion" | |
145 | | string "hue" | |
146 | | string "saturation" | |
147 | | string "color" | |
148 | | string "luminosity" | |
134 | 149 | }? |
135 | 150 | a:documentation [ |
136 | 151 | "\x{a}" ~ |
183 | 198 | attribute in2 { text }, |
184 | 199 | [ a:defaultValue = "over" ] |
185 | 200 | attribute operator { |
186 | string "over" | string "in" | string "out" | string "atop" | string "xor" | string "arithmetic" | |
201 | string "over" | string "in" | string "out" | string "atop" | string "xor" | string "lighter" | string "arithmetic" | |
187 | 202 | }?, |
188 | 203 | attribute k1 { Number.datatype }?, |
189 | 204 | attribute k2 { Number.datatype }?, |
127 | 127 | opf.href.attr = attribute href { datatype.URI } |
128 | 128 | opf.id.attr = attribute id { datatype.ID } |
129 | 129 | opf.i18n.attrs = opf.xml.lang.attr? & opf.dir.attr? |
130 | opf.xml.lang.attr = attribute xml:lang { datatype.languagecode } | |
130 | opf.xml.lang.attr = attribute xml:lang { "" | datatype.languagecode } | |
131 | 131 | opf.dir.attr = attribute dir { 'ltr' | 'rtl' } |
2 | 2 | |
3 | 3 | <ns uri="http://www.idpf.org/2007/opf" prefix="opf"/> |
4 | 4 | <ns uri="http://purl.org/dc/elements/1.1/" prefix="dc"/> |
5 | ||
6 | <!-- Unique ID checks --> | |
5 | 7 | |
6 | 8 | <pattern id="opf.uid"> |
7 | 9 | <rule context="opf:package[@unique-identifier]"> |
26 | 28 | >dcterms:modified illegal syntax (expecting: "CCYY-MM-DDThh:mm:ssZ")</assert> |
27 | 29 | </rule> |
28 | 30 | </pattern> |
29 | ||
30 | <pattern id="opf.refines.relative"> | |
31 | <rule context="*[@refines and starts-with(normalize-space(@refines),'#')][not(ancestor::opf:collection)]"> | |
32 | <let name="refines-target-id" value="substring(normalize-space(@refines), 2)"/> | |
33 | <assert test="//*[normalize-space(@id)=$refines-target-id]">@refines missing target id: "<value-of | |
34 | select="$refines-target-id"/>"</assert> | |
35 | </rule> | |
36 | </pattern> | |
37 | ||
38 | <pattern id="opf.meta.source-of"> | |
39 | <rule context="opf:meta[normalize-space(@property)='source-of']"> | |
40 | <assert test="normalize-space(.) eq 'pagination'">The "source-of" property must have the | |
41 | value "pagination"</assert> | |
42 | <assert | |
43 | test="exists(@refines) and exists(../dc:source[normalize-space(@id)=substring(normalize-space(current()/@refines),2)])" | |
44 | >The "source-of" property must refine a "dc:source" element.</assert> | |
45 | </rule> | |
46 | </pattern> | |
47 | ||
31 | ||
32 | <!-- Link checks --> | |
33 | ||
48 | 34 | <pattern id="opf.link.record"> |
49 | 35 | <rule context="opf:link[tokenize(@rel,'\s+')='record']"> |
50 | 36 | <assert test="exists(@media-type)">The type of "record" references must be identifiable |
60 | 46 | <assert test="exists(@refines)">"voicing" links must have a "refines" attribute.</assert> |
61 | 47 | </rule> |
62 | 48 | </pattern> |
49 | ||
50 | <!-- Metadata checks --> | |
51 | ||
52 | <pattern id="opf.refines.relative"> | |
53 | <rule context="*[@refines and starts-with(normalize-space(@refines),'#')][not(ancestor::opf:collection)]"> | |
54 | <let name="refines-target-id" value="substring(normalize-space(@refines), 2)"/> | |
55 | <assert test="//*[normalize-space(@id)=$refines-target-id]">@refines missing target id: "<value-of | |
56 | select="$refines-target-id"/>"</assert> | |
57 | </rule> | |
58 | </pattern> | |
59 | ||
60 | <pattern id="opf.dc.subject.authority-term"> | |
61 | <rule context="opf:metadata/dc:subject"> | |
62 | <let name="id" value="normalize-space(./@id)"/> | |
63 | <let name="authority" value="//opf:meta[normalize-space(@property)='authority'][substring(normalize-space(@refines), 2) = $id]"/> | |
64 | <let name="term" value="//opf:meta[normalize-space(@property)='term'][substring(normalize-space(@refines), 2) = $id]"/> | |
65 | <report test="(count($authority) = 1 and count($term) = 0)">A term property must be associated with a dc:subject when an authority is specified</report> | |
66 | <report test="(count($authority) = 0 and count($term) = 1)">An authority property must be associated with a dc:subject when a term is specified</report> | |
67 | <report test="(count($authority) > 1 or count($term) > 1)">Only one pair of authority and term properties can be associated with a dc:subject</report> | |
68 | </rule> | |
69 | </pattern> | |
70 | ||
71 | <pattern id="opf.meta.authority"> | |
72 | <rule context="opf:meta[normalize-space(@property)='authority']"> | |
73 | <assert test="exists(../dc:subject[concat('#',normalize-space(@id)) = normalize-space(current()/@refines)])" | |
74 | >Property "authority" must refine a "subject" property.</assert> | |
75 | <!-- Cardinality is checked in opf.dc.subject.authority-term --> | |
76 | </rule> | |
77 | </pattern> | |
63 | 78 | |
64 | 79 | <pattern id="opf.meta.belongs-to-collection"> |
65 | 80 | <rule context="opf:meta[normalize-space(@property)='belongs-to-collection']"> |
69 | 84 | properties.</assert> |
70 | 85 | </rule> |
71 | 86 | </pattern> |
72 | ||
87 | ||
73 | 88 | <pattern id="opf.meta.collection-type"> |
74 | 89 | <rule context="opf:meta[normalize-space(@property)='collection-type']"> |
75 | 90 | <assert |
76 | 91 | test="exists(../opf:meta[normalize-space(@id)=substring(normalize-space(current()/@refines),2)][normalize-space(@property)='belongs-to-collection'])" |
77 | 92 | >Property "collection-type" must refine a "belongs-to-collection" property.</assert> |
78 | </rule> | |
79 | </pattern> | |
80 | ||
93 | <report test="exists(preceding-sibling::opf:meta[normalize-space(@property) = normalize-space(current()/@property)][normalize-space(@refines) = normalize-space(current()/@refines)])" | |
94 | >Property "collection-type" cannot be declared more than once to refine the same "belongs-to-collection" expression.</report> | |
95 | </rule> | |
96 | </pattern> | |
97 | ||
98 | <pattern id="opf.meta.display-seq"> | |
99 | <rule context="opf:meta[normalize-space(@property)='display-seq']"> | |
100 | <report test="exists(preceding-sibling::opf:meta[normalize-space(@property) = normalize-space(current()/@property)][normalize-space(@refines) = normalize-space(current()/@refines)])" | |
101 | >Property "display-seq" cannot be declared more than once to refine the same expression.</report> | |
102 | </rule> | |
103 | </pattern> | |
104 | ||
105 | <pattern id="opf.meta.file-as"> | |
106 | <rule context="opf:meta[normalize-space(@property)='file-as']"> | |
107 | <report test="exists(preceding-sibling::opf:meta[normalize-space(@property) = normalize-space(current()/@property)][normalize-space(@refines) = normalize-space(current()/@refines)])" | |
108 | >Property "file-as" cannot be declared more than once to refine the same expression.</report> | |
109 | </rule> | |
110 | </pattern> | |
111 | ||
112 | <pattern id="opf.meta.group-position"> | |
113 | <rule context="opf:meta[normalize-space(@property)='group-position']"> | |
114 | <report test="exists(preceding-sibling::opf:meta[normalize-space(@property) = normalize-space(current()/@property)][normalize-space(@refines) = normalize-space(current()/@refines)])" | |
115 | >Property "group-position" cannot be declared more than once to refine the same expression.</report> | |
116 | </rule> | |
117 | </pattern> | |
118 | ||
119 | <pattern id="opf.meta.identifier-type"> | |
120 | <rule context="opf:meta[normalize-space(@property)='identifier-type']"> | |
121 | <assert test="exists(../(dc:identifier|dc:source)[concat('#',normalize-space(@id)) = normalize-space(current()/@refines)])" | |
122 | >Property "identifier-type" must refine an "identifier" or "source" property.</assert> | |
123 | <report test="exists(preceding-sibling::opf:meta[normalize-space(@property) = normalize-space(current()/@property)][normalize-space(@refines) = normalize-space(current()/@refines)])" | |
124 | >Property "identifier-type" cannot be declared more than once to refine the same expression.</report> | |
125 | </rule> | |
126 | </pattern> | |
127 | ||
128 | <pattern id="opf.meta.role"> | |
129 | <rule context="opf:meta[normalize-space(@property)='role']"> | |
130 | <assert test="exists(../(dc:creator|dc:contributor|dc:publisher)[concat('#',normalize-space(@id)) = normalize-space(current()/@refines)])" | |
131 | >Property "role" must refine a "creator", "contributor", or "publisher" property.</assert> | |
132 | </rule> | |
133 | </pattern> | |
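A side note on the "opf.meta.role" pattern above: per the 4.2.6 changelog, the one-role-per-element restriction from v4.2.5 was reverted, so the pattern only checks that each "role" meta refines a creator, contributor, or publisher. A minimal Python sketch of that remaining check; the sample metadata, element values, and helper name are illustrative, not EPUBCheck code:

```python
import xml.etree.ElementTree as ET

OPF = "http://www.idpf.org/2007/opf"
DC = "http://purl.org/dc/elements/1.1/"

# Hypothetical sample package metadata: two "role" metas refining one creator,
# which 4.2.6 accepts again after the single-role check was reverted.
metadata = ET.fromstring(f"""
<metadata xmlns="{OPF}" xmlns:dc="{DC}">
  <dc:creator id="aut">Jane Doe</dc:creator>
  <meta property="role" refines="#aut" scheme="marc:relators">aut</meta>
  <meta property="role" refines="#aut" scheme="marc:relators">ill</meta>
</metadata>
""")

def roles_refine_valid_targets(md):
    """Mirror of the Schematron assert: every meta[@property='role'] must
    point (via a #id fragment) at a dc:creator, dc:contributor, or
    dc:publisher sibling."""
    targets = {
        el.get("id")
        for tag in ("creator", "contributor", "publisher")
        for el in md.findall(f"{{{DC}}}{tag}")
        if el.get("id")
    }
    return all(
        (m.get("refines") or "").startswith("#")
        and (m.get("refines") or "")[1:].strip() in targets
        for m in md.findall(f"{{{OPF}}}meta")
        if (m.get("property") or "").strip() == "role"
    )

print(roles_refine_valid_targets(metadata))  # True
```

Unlike the reverted check, no count is taken per refines target, so any number of roles per creator passes.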
134 | ||
135 | <pattern id="opf.meta.source-of"> | |
136 | <rule context="opf:meta[normalize-space(@property)='source-of']"> | |
137 | <assert test="normalize-space(.) eq 'pagination'">The "source-of" property must have the | |
138 | value "pagination"</assert> | |
139 | <assert | |
140 | test="exists(@refines) and exists(../dc:source[normalize-space(@id)=substring(normalize-space(current()/@refines),2)])" | |
141 | >The "source-of" property must refine a "source" property.</assert> | |
142 | <report test="exists(preceding-sibling::opf:meta[normalize-space(@property) = normalize-space(current()/@property)][normalize-space(@refines) = normalize-space(current()/@refines)])" | |
143 | >Property "source-of" cannot be declared more than once to refine the same "source" expression.</report> | |
144 | </rule> | |
145 | </pattern> | |
146 | ||
147 | <pattern id="opf.meta.term"> | |
148 | <rule context="opf:meta[normalize-space(@property)='term']"> | |
149 | <assert test="exists(../dc:subject[concat('#',normalize-space(@id)) = normalize-space(current()/@refines)])" | |
150 | >Property "term" must refine a "subject" property.</assert> | |
151 | <!-- Cardinality is checked in opf.dc.subject.authority-term --> | |
152 | </rule> | |
153 | </pattern> | |
154 | ||
155 | <pattern id="opf.meta.title-type"> | |
156 | <rule context="opf:meta[normalize-space(@property)='title-type']"> | |
157 | <assert test="exists(../dc:title[concat('#',normalize-space(@id)) = normalize-space(current()/@refines)])" | |
158 | >Property "title-type" must refine a "title" property.</assert> | |
159 | <report test="exists(preceding-sibling::opf:meta[normalize-space(@property) = normalize-space(current()/@property)][normalize-space(@refines) = normalize-space(current()/@refines)])" | |
160 | >Property "title-type" cannot be declared more than once to refine the same "title" expression.</report> | |
161 | </rule> | |
162 | </pattern> | |
163 | ||
164 | <!-- Item checks --> | |
81 | 165 | |
82 | 166 | <pattern id="opf.itemref"> |
83 | 167 | <rule context="opf:spine/opf:itemref[@idref]"> |
95 | 179 | <assert test="$item and normalize-space($item/@id) != normalize-space(./@id)">manifest item element fallback attribute |
96 | 180 | must resolve to another manifest item (given reference was "<value-of select="$ref" |
97 | 181 | />")</assert> |
98 | </rule> | |
99 | </pattern> | |
100 | ||
101 | <pattern id="opf.media.overlay"> | |
102 | <rule context="opf:item[@media-overlay]"> | |
103 | <let name="ref" value="./normalize-space(@media-overlay)"/> | |
104 | <let name="item" value="//opf:manifest/opf:item[normalize-space(@id) = $ref]"/> | |
105 | <let name="item-media-type" value="normalize-space($item/@media-type)"/> | |
106 | <assert test="$item-media-type = 'application/smil+xml'">media overlay items must be of | |
107 | the "application/smil+xml" type (given type was "<value-of select="$item-media-type" | |
108 | />")</assert> | |
109 | </rule> | |
110 | </pattern> | |
111 | ||
112 | <pattern id="opf.media.overlay.metadata.global"> | |
113 | <rule context="opf:manifest[opf:item[@media-overlay]]"> | |
114 | <assert test="//opf:meta[normalize-space(@property)='media:duration' and not (@refines)]">global | |
115 | media:duration meta element not set</assert> | |
116 | </rule> | |
117 | </pattern> | |
118 | ||
119 | <pattern id="opf.media.overlay.metadata.item"> | |
120 | <rule context="opf:manifest/opf:item[@media-overlay]"> | |
121 | <let name="mo-idref" value="normalize-space(@media-overlay)"/> | |
122 | <let name="mo-item" value="//opf:item[normalize-space(@id) = $mo-idref]"/> | |
123 | <let name="mo-item-id" value="$mo-item/normalize-space(@id)"/> | |
124 | <let name="mo-item-uri" value="concat('#', $mo-item-id)"/> | |
125 | <assert test="//opf:meta[normalize-space(@property)='media:duration' and normalize-space(@refines) = $mo-item-uri ]">item | |
126 | media:duration meta element not set (expecting: meta property='media:duration' | |
127 | refines='<value-of select="$mo-item-uri"/>')</assert> | |
128 | 182 | </rule> |
129 | 183 | </pattern> |
130 | 184 | |
191 | 245 | (number of "cover-image" items: <value-of select="count($item)"/>).</assert> |
192 | 246 | </rule> |
193 | 247 | </pattern> |
248 | ||
249 | <!-- Rendition properties checks --> | |
194 | 250 | |
195 | 251 | <pattern id="opf.rendition.globals"> |
196 | 252 | <rule context="opf:package/opf:metadata"> |
301 | 357 | |
302 | 358 | <include href="./mod/id-unique.sch"/> |
303 | 359 | |
360 | <!-- Media overlay checks --> | |
361 | ||
362 | <pattern id="opf.duration.metadata.item"> | |
363 | <rule context="opf:meta[normalize-space(@property)='media:duration']"> | |
364 | <assert | |
365 | test="matches(normalize-space(),'^(([0-9]+:[0-5][0-9]:[0-5][0-9](\.[0-9]+)?)|((\s*)[0-5][0-9]:[0-5][0-9](\.[0-9]+)?(\s*))|((\s*)[0-9]+(\.[0-9]+)?(h|min|s|ms)?(\s*)))$')" | |
366 | >The value of the media:duration property must be a valid SMIL3 clock value</assert> | |
367 | </rule> | |
368 | </pattern> | |
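The SMIL3 clock-value regular expression in the "opf.duration.metadata.item" pattern above can be exercised on its own. The following sketch copies the regex verbatim into Python's `re`; the helper name and sample values are illustrative, not taken from the EPUBCheck test suite:

```python
import re

# The SMIL3 clock-value pattern copied verbatim from the Schematron rule above.
CLOCK_VALUE = re.compile(
    r'^(([0-9]+:[0-5][0-9]:[0-5][0-9](\.[0-9]+)?)'    # full clock: h:mm:ss(.frac)
    r'|((\s*)[0-5][0-9]:[0-5][0-9](\.[0-9]+)?(\s*))'  # partial clock: mm:ss(.frac)
    r'|((\s*)[0-9]+(\.[0-9]+)?(h|min|s|ms)?(\s*)))$'  # timecount, optional unit
)

def is_clock_value(value: str) -> bool:
    return CLOCK_VALUE.match(value) is not None

# Illustrative sample values
assert is_clock_value("1:02:03.5")    # full clock value
assert is_clock_value("05:30")        # partial clock value
assert is_clock_value("12.345s")      # timecount in seconds
assert not is_clock_value("1:75:00")  # minutes field out of range
```

Note the Schematron rule applies normalize-space() first, so leading and trailing whitespace is already trimmed before the regex runs.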
369 | ||
370 | <pattern id="opf.media.overlay"> | |
371 | <rule context="opf:item[@media-overlay]"> | |
372 | <let name="ref" value="./normalize-space(@media-overlay)"/> | |
373 | <let name="item" value="//opf:manifest/opf:item[normalize-space(@id) = $ref]"/> | |
374 | <let name="item-media-type" value="normalize-space($item/@media-type)"/> | |
375 | <let name="media-type" value="normalize-space(@media-type)"/> | |
376 | <assert test="$item-media-type = 'application/smil+xml'">media overlay items must be of | |
377 | the "application/smil+xml" type (given type was "<value-of select="$item-media-type" | |
378 | />")</assert> | |
379 | <assert test="$media-type='application/xhtml+xml' or $media-type='image/svg+xml'" | |
380 | >The media-overlay attribute is only allowed on XHTML and SVG content documents.</assert> | |
381 | </rule> | |
382 | </pattern> | |
383 | ||
384 | <pattern id="opf.media.overlay.metadata.global"> | |
385 | <rule context="opf:manifest[opf:item[@media-overlay]]"> | |
386 | <assert test="//opf:meta[normalize-space(@property)='media:duration' and not (@refines)]">global | |
387 | media:duration meta element not set</assert> | |
388 | </rule> | |
389 | </pattern> | |
390 | ||
391 | <pattern id="opf.media.overlay.metadata.item"> | |
392 | <rule context="opf:manifest/opf:item[@media-overlay]"> | |
393 | <let name="mo-idref" value="normalize-space(@media-overlay)"/> | |
394 | <let name="mo-item" value="//opf:item[normalize-space(@id) = $mo-idref]"/> | |
395 | <let name="mo-item-id" value="$mo-item/normalize-space(@id)"/> | |
396 | <let name="mo-item-uri" value="concat('#', $mo-item-id)"/> | |
397 | <assert test="//opf:meta[normalize-space(@property)='media:duration' and normalize-space(@refines) = $mo-item-uri ]">item | |
398 | media:duration meta element not set (expecting: meta property='media:duration' | |
399 | refines='<value-of select="$mo-item-uri"/>')</assert> | |
400 | </rule> | |
401 | </pattern> | |

  <pattern id="opf.media.overlay.metadata.active-class">
    <rule context="opf:meta[normalize-space(@property)='media:active-class']">
      <report test="@refines"> @refines must not be used with the media:active-class property</report>
    </rule>
  </pattern>

  <pattern id="opf.media.overlay.metadata.playback-active-class">
    <rule context="opf:meta[normalize-space(@property)='media:playback-active-class']">
      <report test="@refines"> @refines must not be used with the media:playback-active-class property</report>
    </rule>
  </pattern>

  <!-- EPUB 3.2 New Checks -->

  <pattern id="opf.spine.duplicate.refs">
    </rule>
  </pattern>

  <pattern id="opf.subject.authority-term">
    <rule context="opf:metadata/dc:subject">
      <let name="id" value="normalize-space(./@id)"/>
      <let name="authority" value="//opf:meta[normalize-space(@property)='authority'][substring(normalize-space(@refines), 2) = $id]"/>
      <let name="term" value="//opf:meta[normalize-space(@property)='term'][substring(normalize-space(@refines), 2) = $id]"/>
      <report test="(count($authority) = 1 and count($term) = 0)">A term property must be associated with a dc:subject when an authority is specified</report>
      <report test="(count($authority) = 0 and count($term) = 1)">An authority property must be associated with a dc:subject when a term is specified</report>
      <report test="(count($authority) > 1 or count($term) > 1)">Only one pair of authority and term properties can be associated with a dc:subject</report>
    </rule>
  </pattern>
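For illustration only (subject, authority, and term values invented), a `dc:subject` that passes the authority/term pairing check above looks like this — exactly one `authority` and one `term` meta, each refining the subject by its ID:

```xml
<dc:subject id="subject01">Science Fiction</dc:subject>
<meta refines="#subject01" property="authority">BISAC</meta>
<meta refines="#subject01" property="term">FIC028000</meta>
```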

  <!-- EPUB 3.2 Deprecated Features -->

  <pattern id="opf.bindings.deprecated">
min_exclusive_violation=espressione non valida: {0} deve essere maggiore di {1}
max_inclusive_violation=espressione non valida: {0} deve essere minore o uguale a {1}
max_exclusive_violation=espressione non valida: {0} deve essere minore di {1}
pattern_violation=espressione non valida: {0} non soddisfa l''''espressione regolare {1}
entity_violation=espressione non valida: {0} deve essere un nome dichiarato nel DTD come \"unparsed entity\"
undeclared_prefix=espressione non valida: {0} deve essere un nome qualificato dichiarato
precision_violation=espressione non valida: {0} deve avere almeno {1} cifre decimali (ora: {2})
#!c:\python27\python

import os
import sys
import datetime
import time
import webbrowser
import urllib
import optparse
import subprocess
import zipfile
import shutil
import json
import functools
import Dictionary
import CompareResults

defaultJarName = os.path.join(os.path.dirname(os.path.realpath(__file__)), r"epubcheck.jar")

def parse_args(argv):
    prog_dir = os.path.dirname(argv[0])
    usage = """
Usage: %s [OPTION]
BookReporter: run ePubCheck on all ePub files in the target directory, optionally preserving
the generated .json output files, and compare results to prior checks if old results are found.
With --ppDiffs, instead pretty-print any jsondiffs.json files found in the json directory.
"""[1:-1] % os.path.basename(argv[0])

    parser = optparse.OptionParser(usage=usage)
    parser.add_option("-d", "--directory", dest="target", type="str", default=".",
                      help="Directory on which ePubCheck will be run, default is the current working directory")
    parser.add_option("-f", "--file", dest="targetFile", type="str", default="",
                      help=r'''File or comma-separated list of files to run the check on; if -f is omitted, all files
in the target directory will be checked. If you include a fully qualified path to the first file, you can add
additional comma-separated file names in the same directory as the first file (-f /path/a.epub,b.epub).
''')
    parser.add_option("--NoSaveJson", action="store_false", dest="saveJson", default=True,
                      help=r"Do NOT save ePubCheck .json output files")
    parser.add_option("--NoCompareJson", action="store_false", dest="compareJson", default=True,
                      help=r"Do NOT compare the json created during this check with the most recently saved .json result, if found.")
    parser.add_option("--EanOnlyJsonNames", action="store_true", dest="jsonNamedByEAN", default=False,
                      help=r"Use this flag to force .json file names to use the EAN-only naming convention, <ean>.ePubCheck.json, not <file_name>.ePubCheck.json names. Files not conforming to the EAN-first naming pattern will use the <file_name> convention")
    parser.add_option("-j", "--jsonDir", dest="jsonDir", type="str",
                      default=r"",
                      help=r"unless --NoSaveJson is used, ePubCheck .json output files will be preserved in either the location specified by this switch or, if it is omitted, in <targetDir>\ePubCheckJson")
    parser.add_option("--ppJson", dest="ppJson", action="store_true", default=False,
                      help=r"Skip checks and 'pretty-print' any json files in the target directory's json output directory.")
    parser.add_option("--ppDiffs", dest="ppDiffs", action="store_true", default=False,
                      help=r"Skip checks and simply 'pretty-print' any jsondiffs.json files found in the target directory's json output directory.")
    parser.add_option("-v", "--verbose", dest="verbose", action="store_true", default=False,
                      help=r"Show all messages grouped by type")
    parser.add_option("-q", "--Hide_errors", dest="showErrors", action="store_false", default=True,
                      help=r"'Quiet' output mode; don't list FATAL and ERROR messages; by default these errors are always displayed on the console")
    parser.add_option("-w", "--warning", dest="showWarning", action="store_true", default=False,
                      help=r"Show WARNING messages; by default, these messages are not shown on the console")
    parser.add_option("-u", "--usage", dest="showUsage", action="store_true", default=False,
                      help=r"Show USAGE messages; by default, these messages are not shown on the console")

    parser.add_option("-l", "--logging", dest="loggingFlag", default=False, action="store_true",
                      help=r"Enable logging to a tab-delimited file")
    parser.add_option("--logdir", dest="logdir", type="str",
                      default=r"$EPUBCHECK-LOGS",
                      help=r'''Log file location used by this tool, defaults to the value of the environment variable "EPUBCHECK-LOGS";
if EPUBCHECK-LOGS is defined and a valid directory, logging is enabled to that directory automatically (-l is not required if EPUBCHECK-LOGS is defined);
if EPUBCHECK-LOGS is undefined and logging is enabled, logs are written in the current working directory;
if EPUBCHECK-LOGS is defined, automatic logging can be disabled by using the "--logdir none" switch.
''')
    parser.add_option("--logfile", dest="logfile", type="str",
                      default=r"NookReporter.TabDelimitedFile",
                      help=r"Log file name used by this tool, default=NookReporter.TabDelimitedFile")
    parser.add_option("--customCheckMessages", dest="overrideFile", type="str", default="$ePubCheckCustomMessageFile",
                      help=r'''Name of a custom ePubCheck message file for use in these checks.
If not specified, the value of the environment variable $ePubCheckCustomMessageFile will be used, if defined.
To override the value of that environment variable, use "--customCheckMessages=<filePath>" to use an alternate file,
or use "--customCheckMessages=none" to run ePubCheck with the default set of check message severities.
''')
    parser.add_option("--applicationJar", dest="appJar", type="str",
                      default=defaultJarName,
                      help=r"if specified, the named jar will be used; if not specified, " + defaultJarName + " in this script's directory will be used")
    parser.add_option("--jarArgs", dest="jarArgs", type="str",
                      default=r"",
                      help=r'''Any args specified with the --jarArgs switch will be passed to the applicationJar (ePubCheck); by default, no jarArgs are passed.
Note that the BookReporter adds several ePubCheck command line switches automatically, including -j, -mode exp, and possibly others.
Any parameters specified with --jarArgs will be appended to the command line, and could conflict with the switches added automatically.
If the parameters include spaces, quote the --jarArgs parameter string.
''')
    parser.add_option("--timeout", dest="timeoutVal", type="int", default=30,
                      help=r"Abort an ePubCheck process that takes longer than the --timeout value in seconds. NOTE: This setting is ignored unless you are using Python 3.3 or later")
    opts, args = parser.parse_args(argv[1:])
    return opts, args

def fileMD5(fileName):
    import hashlib
    md5 = hashlib.md5()
    block_size = 128 * md5.block_size
    with open(fileName, 'rb') as f:
        for chunk in iter(functools.partial(f.read, block_size), b''):
            md5.update(chunk)
    return md5.hexdigest()

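The loop in fileMD5 relies on the two-argument form of iter(), which keeps calling f.read(block_size) until it returns the sentinel b'', so large ePubs are never read into memory whole. A self-contained sketch of the same idiom (Python 3 shown; the helper name and data are invented):

```python
import functools
import hashlib
import io

def chunked_md5(stream, block_size=8192):
    """Hash a binary stream in fixed-size chunks."""
    md5 = hashlib.md5()
    # iter(callable, sentinel) yields chunks until read() returns b''
    for chunk in iter(functools.partial(stream.read, block_size), b''):
        md5.update(chunk)
    return md5.hexdigest()
```

chunked_md5 over an io.BytesIO matches hashlib.md5 over the whole payload while holding at most block_size bytes at a time.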
def checkForException(outputFile, targetString):
    with open(outputFile, 'r') as f:
        for line in f:
            #print(" debug: log file line: " + line.rstrip("\n"))
            if targetString in line:
                return True
    return False


def logMessages(theMessages, showType):
    nMessage = 0
    for message in theMessages:
        if message["severity"] == showType:
            nMessage += 1
            if nMessage == 1:
                print (" " + showType + " messages:")
            nTimes = len(message["locations"])
            if message["additionalLocations"] != 0: nTimes += message["additionalLocations"] - 1
            timesStr = " (" + str(nTimes) + " occurrence"
            if nTimes == 1:
                timesStr += ")"
            else:
                timesStr += "s)"
            try:
                print (" " + str(nMessage) + ": " + message["ID"] + ": " + message["message"] + timesStr)
            except:
                print (" " + str(nMessage) + ": " + message["ID"] + ": " + urllib.quote(message["message"].encode('utf-8')) + timesStr)
            if gopts.verbose:
                for loc in message["locations"]:
                    if loc["line"] != -1 or loc["column"] != -1:
                        locString = ": line: " + str(loc["line"]) + " col: " + str(loc["column"])
                    else:
                        locString = ""
                    try:
                        print(" " + loc["fileName"] + locString)
                    except:
                        print(" " + urllib.quote(loc["fileName"].encode('utf-8')) + urllib.quote(locString.encode('utf-8')))
                print("")
    if nMessage > 0:
        print (" --")

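logMessages walks ePubCheck's JSON message records, which (as used here) carry severity, ID, message, locations, and additionalLocations fields. The occurrence-counting rule — all listed locations, plus any additionalLocations beyond the first — can be restated as a small self-contained helper (field names taken from the surrounding code; sample data invented):

```python
def count_occurrences(messages, severity):
    """Total occurrences of all messages with the given severity."""
    total = 0
    for message in messages:
        if message["severity"] == severity:
            n = len(message["locations"])
            # additionalLocations, when non-zero, counts extras beyond the first
            if message["additionalLocations"] != 0:
                n += message["additionalLocations"] - 1
            total += n
    return total
```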
def listServerFiles(theDir):
    theFiles = []
    os.chdir(theDir)
    files = os.listdir(".")
    for file in files:
        theFiles.append(file.strip())
    return theFiles

def logStats(log_file, checkerVersion, book_dir, book_file, book_path, elapsedTime, checkTime, ePubVersion, comparedTo, checkChanged, pubChanged, spineChanged, manifestChanged, messagesChanged, numFatal, numError, isScripted, hasFixedFormat):

    now = datetime.datetime.now()

    if not os.path.exists(log_file):
        print ("File " + log_file + " doesn't exist, initializing file...")
        f = open(log_file, 'a')
        f.write("logDate\t")
        f.write("logTime\t")
        f.write("logTool\t")
        f.write("checkerVersion\t")
        f.write("PubDir\t")
        f.write("ePubFile\t")
        f.write("ePubPath\t")
        f.write("elapsedTime\t")
        f.write("checkTime\t")
        f.write("ePubVersion\t")
        f.write("comparedTo\t")
        f.write("checkChanged\t")
        f.write("pubChanged\t")
        f.write("spineChanged\t")
        f.write("manifestChanged\t")
        f.write("messagesChanged\t")
        f.write("numFatal\t")
        f.write("numErrors\t")
        f.write("isScripted\t")
        f.write("hasFixedFormat\n")
    else:
        f = open(log_file, 'a')

    f.write(str(now.date()) + "\t" + str(now.time()) + "\t")
    f.write("BookReporter.py" + "\t")
    f.write(checkerVersion + "\t")
    f.write(book_dir + "\t")
    f.write(book_file.rstrip("\r") + "\t")
    f.write(book_path + "\t")
    f.write(elapsedTime + "\t")
    f.write(checkTime + "\t")
    f.write(ePubVersion + "\t")
    f.write(comparedTo + "\t")
    f.write(checkChanged + "\t")
    f.write(pubChanged + "\t")
    f.write(spineChanged + "\t")
    f.write(manifestChanged + "\t")
    f.write(messagesChanged + "\t")
    f.write(numFatal + "\t")
    f.write(numError + "\t")
    f.write(isScripted + "\t")
    f.write(hasFixedFormat + "\n")
    f.close()

    return "BookReporter Logging complete..."

def ppEpubCheckChanges(jsonDelta):
    #
    # print the check metadata plus the publication and manifest item metadata changes
    #
    oldCheck = jsonDelta["summary"]["oldCheck"]
    newCheck = jsonDelta["summary"]["newCheck"]
    print(" ePubCheck results comparison")
    print(" --\r\n")
    print(" New file: " + newCheck["path"] + " checked on: " + newCheck["checkDate"])
    #
    # json schema change: moved file name from publication to checker; if the name isn't found, use one in publication; this hack can be removed soon.
    #
    try:
        print(" Old file: " + oldCheck["path"] + " checked on: " + oldCheck["checkDate"])
    except:
        print(" Old file: path not found due to json output schema change; the old file was checked on: " + oldCheck["checkDate"])
    print(" --\r\n")
    print(" Summary: publication metadata changes: " + str(jsonDelta["summary"]["publicationChanges"]))
    print(" spine item changes: " + str(jsonDelta["summary"]["spineChanges"]))
    print(" manifest item changes: " + str(jsonDelta["summary"]["itemChanges"]))
    print(" --\r\n")
    if jsonDelta["summary"]["publicationChanges"] > 0:
        print(" Publication property changes:")
        pubChanges = jsonDelta["publication"]
        if len(pubChanges["adds"]) > 0:
            print(" New properties added: ")
            for item in pubChanges["adds"]:
                print(" '" + item + "' (value: " + str(pubChanges["adds"][item]) + ")")
        if len(pubChanges["cuts"]) > 0:
            print(" Properties removed: ")
            for item in pubChanges["cuts"]:
                print(" '" + item + "' (value was: " + str(pubChanges["cuts"][item]) + ")")
        if len(pubChanges["changes"]) > 0:
            print(" Properties changed: ")
            for item in pubChanges["changes"]:
                if item == "embeddedFonts":
                    print(" " + item + " changed -- new embedded font list:")
                    for fontString in pubChanges["changes"][item]["newValue"]:
                        print(" " + str(fontString))
                    print(" " + item + " changed -- old embedded font list:")
                    for fontString in pubChanges["changes"][item]["oldValue"]:
                        print(" " + str(fontString))
                else:
                    try:
                        print(" '" + item + "' changed -- new value: '" + str(pubChanges["changes"][item]["newValue"]) + "'; old value: '" + str(pubChanges["changes"][item]["oldValue"]) + "'")
                    except:
                        print(" '" + item + "' changed -- exception occurred trying to render old or new property value")
    else:
        print(" Publication property changes: NONE")

    print(" --\r\n")

    if jsonDelta["summary"]["spineChanges"] > 0:
        print(" Spine changes:")
        spineChanges = jsonDelta["spine"]
        print(" Unchanged spine items: " + str(spineChanges["unchanged"]))
        if len(spineChanges["adds"]) > 0:
            print(" New spine items added: ")
            for item in spineChanges["adds"]:
                print(" '" + item + "' order: " + str(spineChanges["adds"][item]))
        if len(spineChanges["cuts"]) > 0:
            print(" Spine items removed: ")
            for item in spineChanges["cuts"]:
                print(" '" + item + "' order was: " + str(spineChanges["cuts"][item]))
        if len(spineChanges["orderChanges"]) > 0:
            print(" Spine items reordered: ")
            for item in spineChanges["orderChanges"]:
                print(" '" + item + "' spine order changed -- new order: '" + str(spineChanges["orderChanges"][item]["newSpineIndex"]) + "'; old order: '" + str(spineChanges["orderChanges"][item]["oldSpineIndex"]) + "'")

        if len(spineChanges["contentChanges"]) > 0:
            print(" Spine item content changes: ")
            for item in spineChanges["contentChanges"]:
                print(" spine ID: '" + item + "' file: '" + str(spineChanges["contentChanges"][item]) + "'")
    else:
        print(" Publication spine changes: NONE")

    print(" --\r\n")

    if jsonDelta["summary"]["itemChanges"] > 0:
        print(" Publication manifest item changes:")
        maniChanges = jsonDelta["manifest"]

        if len(maniChanges["adds"]) > 0:
            print(" Manifest items added: ")
            for item in maniChanges["adds"]:
                print(" '" + item + "'")
                if gopts.verbose:
                    for property in maniChanges["adds"][item]:
                        if property == "referencedItems":
                            print(" referenced items: ")
                            for references in maniChanges["adds"][item][property]:
                                print(" " + references)
                        else:
                            try:
                                print(" property: " + property + " -- value: " + str(maniChanges["adds"][item][property]))
                            except:
                                print(" property: " + property + " -- value caused exception during output")

        if len(maniChanges["cuts"]) > 0:
            print("\r\n Manifest items removed: ")
            for item in maniChanges["cuts"]:
                print(" '" + item + "'")
                if gopts.verbose:
                    for property in maniChanges["cuts"][item]:
                        if property == "referencedItems":
                            print(" referenced items: ")
                            for references in maniChanges["cuts"][item][property]:
                                print(" " + references)
                        else:
                            try:
                                print(" property: " + property + " -- value: " + str(maniChanges["cuts"][item][property]))
                            except:
                                print(" property: " + property + " -- value caused exception during output")

        if len(maniChanges["changes"]) > 0:
            theChanges = maniChanges["changes"]
            print("\r\n Manifest item property changes: ")
            for itemId in theChanges:
                if "adds" in theChanges[itemId]:
                    for theAdd in theChanges[itemId]["adds"]:
                        print(" Manifest item ID: '" + itemId + "' -- property '" + theAdd + "' was added to the manifest item (value: " + str(theChanges[itemId]["adds"][theAdd]) + ")")
                if "cuts" in theChanges[itemId]:
                    for theCut in theChanges[itemId]["cuts"]:
                        print(" Manifest item ID: '" + itemId + "' -- property '" + theCut + "' was removed from the manifest item (value: " + str(theChanges[itemId]["cuts"][theCut]) + ")")
                if "changes" in theChanges[itemId]:
                    for property in theChanges[itemId]["changes"]:
                        if property == "referencedItems":
                            if "adds" in theChanges[itemId]["changes"][property]:
                                for theAdd in theChanges[itemId]["changes"][property]["adds"]:
                                    try:
                                        print(" Manifest item ID: '" + itemId + "' -- added reference to: '" + theAdd + "'")
                                    except:
                                        print(" Manifest item ID: '" + itemId + "' -- added reference to: exception thrown during output of the added reference")
                            if "cuts" in theChanges[itemId]["changes"][property]:
                                for theCut in theChanges[itemId]["changes"][property]["cuts"]:
                                    try:
                                        print(" Manifest item ID: '" + itemId + "' -- removed reference to: '" + theCut + "'")
                                    except:
                                        print(" Manifest item ID: '" + itemId + "' -- removed reference to: exception thrown during output of the cut reference")
                        elif property == "checkSum":
                            try:
                                print(" Manifest item ID: '" + itemId + "' -- the associated file '" + str(theChanges[itemId]["changes"][property]["newValue"]) + "' contents changed")
                            except:
                                print(" Manifest item ID: '" + itemId + "' -- the associated file (name caused exception on output) contents changed")
                        else:
                            try:
                                print(" Manifest item ID: '" + itemId + "' -- property '" + property + "' changed -- newValue: " + str(theChanges[itemId]["changes"][property]["newValue"]) + "; oldValue: " + str(theChanges[itemId]["changes"][property]["oldValue"]))
                            except:
                                print(" Manifest item ID: '" + itemId + "' -- property '" + property + "' changed -- old or new property value caused exception during output")
    else:
        print(" Publication manifest changes: NONE")
    print(" --\r\n")

def compareMessages(newMessages, oldMessages):
    if newMessages == oldMessages:
        print (" Message collections are identical")
        messDelta = "Mess=!"
    else:
        print (" Message collections comparisons:")
        messDelta = "Mess-"
        lenNew = len(newMessages)
        lenOld = len(oldMessages)
        oldItems = set()
        oldDict = {}
        newItems = set()
        newDict = {}
        blankIDinNew = 0
        blankIDinOld = 0
        for item in newMessages:
            key = item["ID"]
            if key == "":
                blankIDinNew += 1
                key = "BlankID_" + str(blankIDinNew)
            if not key in newItems:
                newItems.add(key)
                newDict[key] = item
            '''
            else:
                print (" Message collections comparison: ID Collision in NEW Messages: " + key)
            '''
        for item in oldMessages:
            key = item["ID"]
            if key == "":
                blankIDinOld += 1
                key = "BlankID_" + str(blankIDinOld)
            if not key in oldItems:
                oldItems.add(key)
                oldDict[key] = item
            '''
            else:
                print (" Message collections comparison: ID Collision in OLD Messages: " + key)
            '''

        delta = Dictionary.DictCompare(newDict, oldDict)
        adds = delta.added()
        cuts = delta.removed()
        changes = delta.changed()
        if len(adds) != 0:
            messDelta += "A" + str(len(adds)) + "-"
            for id in adds:
                print (" Message ID: " + id + " added")
        else:
            messDelta += "xA-"
            print (" No Message items added")
        if len(cuts) != 0:
            messDelta += "R" + str(len(cuts)) + "-"
            for id in cuts:
                print (" Message ID: " + id + " removed")
        else:
            messDelta += "xR-"
            print (" No Message items removed")
        if len(changes) != 0:
            messDelta += "C" + str(len(changes))
            for id in changes:
                newRecord = newDict[id]
                oldRecord = oldDict[id]
                recDelta = Dictionary.DictCompare(newRecord, oldRecord)
                for name in recDelta.added():
                    print (' Message ID: "' + id + '" -- property value "' + name + '" added; value: "' + str(newRecord[name]) + '"')
                for name in recDelta.removed():
                    print (' Message ID: "' + id + '" -- property value "' + name + '" removed; old value was: "' + str(oldRecord[name]) + '"')
                for name in recDelta.changed():
                    if name == "locations":
                        print (' Message ID: "' + id + '" -- property value "' + name + '" changed; new occurrence count: ' + str(len(newRecord[name])) + '; old occurrence count: ' + str(len(oldRecord[name])))
                        newLocs = set()
                        oldLocs = set()
                        oldLocsDict = {}
                        newLocsDict = {}
                        for locs in newRecord[name]:
                            if not locs["fileName"]:
                                locs["fileName"] = id + "HasNullFileNameLoc"
                            locID = locs["fileName"] + "-" + str(locs["line"]) + "-" + str(locs["column"])
                            if not locID in newLocs:
                                newLocs.add(locID)
                                newLocsDict[locID] = locs
                            else:
                                print (" Messages: duplicate location in NEW message collection for message ID: " + id + " Location: " + locs["fileName"] + " @ " + str(locs["line"]) + ":" + str(locs["column"]))
                        for locs in oldRecord[name]:
                            if not locs["fileName"]:
                                locs["fileName"] = id + "HasNullFileNameLoc"
                            locID = locs["fileName"] + "-" + str(locs["line"]) + "-" + str(locs["column"])
                            if not locID in oldLocs:
                                oldLocs.add(locID)
                                oldLocsDict[locID] = locs
                            else:
                                print (" Messages: duplicate location in OLD message collection for message ID: " + id + " Location: " + locs["fileName"] + " @ " + str(locs["line"]) + ":" + str(locs["column"]))
                        locsDelta = Dictionary.DictCompare(newLocsDict, oldLocsDict)
                        for locs in locsDelta.added():
                            try:
                                print (' Message ID: "' + id + '" -- location added; value: "' + newLocsDict[locs]['fileName'] + ' @ ' + str(newLocsDict[locs]['line']) + ':' + str(newLocsDict[locs]['column']) + '"')
                            except:
                                print (' Message ID: "' + id + '" -- location added; value: caused exception on output')

                        for locs in locsDelta.removed():
                            try:
                                print (' Message ID: "' + id + '" -- location removed; value: "' + oldLocsDict[locs]['fileName'] + ' @ ' + str(oldLocsDict[locs]['line']) + ':' + str(oldLocsDict[locs]['column']) + '"')
                            except:
                                print (' Message ID: "' + id + '" -- location removed; value: caused exception on output')
                        '''
                        for locs in locsDelta.changed():
                            print (' Manifest ID: "' + id + '" -- property value "' + name + '" changed; new value: "' + str(newRecord[name]) + '"; old value: "' + str(oldRecord[name]) + '"')
                        '''
                    else:
                        print (' Message ID: "' + id + '" -- property value "' + name + '" changed; new value: "' + str(newRecord[name]) + '"; old value: "' + str(oldRecord[name]) + '"')

        else:
            messDelta += "xC"
            print (" No Message item properties were changed")
        if blankIDinNew != 0:
            print (" " + str(blankIDinNew) + " blank Message item IDs in new Messages found")
        if blankIDinOld != 0:
            print (" " + str(blankIDinOld) + " blank Message item IDs in old Messages found")
    print (" --")
    return messDelta

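compareMessages keys both message collections by ID and diffs the resulting dicts with Dictionary.DictCompare, a project-local module imported at the top of the script. A minimal stand-in exposing the same added()/removed()/changed() interface (my sketch, not the project's implementation):

```python
class DictCompare(object):
    """Minimal dict diff exposing the added()/removed()/changed() interface."""

    def __init__(self, new, old):
        self.new = new
        self.old = old
        self.new_keys = set(new)
        self.old_keys = set(old)

    def added(self):
        # keys only present in the new dict
        return self.new_keys - self.old_keys

    def removed(self):
        # keys only present in the old dict
        return self.old_keys - self.new_keys

    def changed(self):
        # keys present in both dicts whose values differ
        return set(k for k in self.new_keys & self.old_keys
                   if self.new[k] != self.old[k])
```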
print ("BookReporter Tool: Check ePubs and optionally compare check results to a previous check")
global gopts
gopts, args = parse_args(sys.argv)

# print (str(gopts.target))
#
# set up logging of checking activity
#
logging = gopts.loggingFlag
logdir = os.path.expandvars(gopts.logdir)
if (logging or os.path.isdir(logdir)) and logdir != "none":
    logging = True
    #
    # verify that the log dir and file are writable
    #
    if not os.path.isdir(logdir):
        print ('Activity logging directory "' + logdir + '" is not a valid dir, logging to the current working directory')
        statsLog = os.path.join(".", gopts.logfile)
    else:
        statsLog = os.path.join(logdir, gopts.logfile)
    print ("Activity logging is being performed to " + statsLog)

#
# find the epubcheck jar, or give up...
#
if os.path.exists(gopts.appJar):
    ePubCheckCmd = "java -jar " + gopts.appJar
else:
    print ("'" + gopts.appJar + "' was not found; BookReporter.py is aborting...")
    exit(1)
#print ("BookReporter is using the java command: " + ePubCheckCmd)

#
# decide whether this Python (2.7 vs 3.3+) supports the timeout= flag on subprocess.call
#

# compare version tuples rather than parsing sys.version as a float,
# which would misorder versions such as 3.10 vs 3.3
useTimeout = sys.version_info[:2] >= (3, 3)
pyVersion = "%d.%d" % sys.version_info[:2]
if useTimeout:
    print("BookReporter is running on Python version: " + pyVersion + " and is using a check time limit of: " + str(gopts.timeoutVal) + "s")
else:
    print("BookReporter is running on Python version: " + pyVersion + "; no check time limit is being enforced. Some books can take > 5 minutes to check...")
523 | ||
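Python version gates like the one above are safest as tuple comparisons on sys.version_info: a float-based check misorders two-digit minor versions, since float("3.10") compares below 3.3. A standalone sketch (the helper name is mine):

```python
import sys

def supports_subprocess_timeout(version_info=None):
    """True when subprocess.call accepts timeout= (Python 3.3+)."""
    if version_info is None:
        version_info = sys.version_info
    # tuple comparison is element-wise, so (3, 10) >= (3, 3) holds,
    # while the float 3.10 would compare below 3.3
    return tuple(version_info)[:2] >= (3, 3)
```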

#
# ensure that targetDir is actually a directory...
#
targetDir = gopts.target
if not os.path.isdir(targetDir):
    print ("-d " + targetDir + " is not a directory, aborting...")
    sys.exit(1)

ppDiffs = gopts.ppDiffs
ppJson = gopts.ppJson
#
# listing will contain the list of files to check: either ePubs in targetDir, or
# .epubcheck.json / .jsondiffs.json files in the json output directory when --ppJson or --ppDiffs is used
#
if gopts.overrideFile == "" or gopts.overrideFile == "none":
    print("BookReporter is using NO ePubCheck custom message file... even if one is specified by the environment variable $ePubCheckCustomMessageFile")
    overrideCmdStr = ""
else:
    overrideFile = os.path.expandvars(gopts.overrideFile)
    if overrideFile != "" and os.path.exists(overrideFile):
        print("BookReporter is using the ePubCheck custom message file: " + overrideFile)
        overrideCmdStr = ' -c "' + overrideFile + '" '
        print("BookReporter override command string= '" + overrideCmdStr + "'")
    else:
        print("BookReporter could not find the --customCheckMessages file: " + overrideFile + "; check continuing without an override file...")
        overrideCmdStr = ""

#
# set up the errorLogDirectory name, if necessary; but don't create it until it's needed
#

if gopts.jsonDir == "":
    errorsDir = os.path.join(targetDir, "ePubCheckJson")
else:
    errorsDir = gopts.jsonDir

if not ppDiffs and not ppJson:
    targetFileType = ".epub"
    if gopts.targetFile == "":
        listing = os.listdir(targetDir)
    else:
        #
        # handle the case where
        #   the target file is not specified,
        #   where it contains a list of comma separated file names,
        #   and where it contains a fq pathname
        #
        filePath, fileName = os.path.split(gopts.targetFile)
        if filePath != "":
            targetDir = filePath
        listing = fileName.split(",")
else:
    targetDir = errorsDir
    if ppJson:
        targetFileType = ".epubcheck.json"
    if ppDiffs:
        targetFileType = ".jsondiffs.json"
    if gopts.targetFile == "":
        listing = os.listdir(targetDir)
    else:
        filePath, fileName = os.path.split(gopts.targetFile)
        if filePath != "":
            targetDir = filePath
        listing = fileName.split(",")

nClean = 0
nErrs = 0
nChecked = 0
nTotalFiles = len(listing)

print ("--\r\n")
print ("Target Dir= " + targetDir + " (contains " + str(nTotalFiles) + " files; looking for files of type '" + targetFileType + "' to examine)")

saveJson = gopts.saveJson
if saveJson:
    if not os.path.exists(errorsDir):
        print ("Dir: " + errorsDir + " does not exist; creating it...")
        os.mkdir(errorsDir)
else:
    print (".json output files are NOT being saved")
print ("--")
print ("")

609 | for file in listing: | |
610 | if ppJson or ppDiffs: | |
611 | if targetFileType in file.lower(): | |
612 | if ppJson and file.lower().endswith(targetFileType): | |
613 | print(" Pretty print: " + file) | |
614 | jsonFile = os.path.join(targetDir, file) | |
615 | jsonData = open(jsonFile, "r").read() | |
616 | checkResults = json.loads(jsonData) | |
617 | print(" Messages Summary:") | |
618 | logMessages(checkResults["messages"], "FATAL") | |
619 | logMessages(checkResults["messages"], "ERROR") | |
620 | if gopts.verbose or gopts.showWarning: | |
621 | logMessages(checkResults["messages"], "WARNING") | |
622 | if gopts.verbose or gopts.showUsage: | |
623 | logMessages(checkResults["messages"], "USAGE") | |
624 | print ("--") | |
625 | print ("") | |
626 | else: | |
627 | print ("File #" + str(nChecked) + ": " + file + " is not of type '" + targetFileType + "'; skipped...") | |
628 | continue | |
629 | if ppDiffs: | |
630 | print(" Prettyprint: " + file) | |
631 | jsonFile = os.path.join(targetDir, file) | |
632 | jsonData = open(jsonFile, "r").read() | |
633 | jsonDiffs = json.loads(jsonData) | |
634 | ppEpubCheckChanges(jsonDiffs) | |
635 | else: | |
636 | print ("File #" + str(nChecked) + ": " + file + " is not of type '" + targetFileType + "'; skipped...") | |
637 | continue | |
638 | continue | |
639 | ||
640 | startTime = time.time() | |
641 | nChecked += 1 | |
642 | expString = "" | |
643 | if os.path.splitext(file)[-1].lower() != ".epub": | |
644 | if os.path.isdir(os.path.join(targetDir, file)) and os.path.exists(os.path.join(targetDir, file, "mimetype")): | |
645 | print ("File #" + str(nChecked) + ": " + file + " is a directory containing a mimetype file; treating it as an expanded ePub...") | |
646 | expString = " --mode exp" | |
647 | if(os.path.exists(os.path.join(targetDir, file + ".epub"))): | |
648 | expString += " --save" | |
649 | print(" ePub: '" + os.path.join(targetDir, file + ".epub") + "' exists; it will be overwritten, or if severe errors exist when checking the expanded ePub directory '" + file + "' it will be deleted.") | |
650 | else: | |
651 | print ("File #" + str(nChecked) + ": " + file + " is not an ePub, skipped...") | |
652 | continue | |
653 | # | |
654 | # Process an ePub | |
655 | # | |
656 | ePub = os.path.join(targetDir, file) | |
657 | print ("File #" + str(nChecked) + " (of " + str(nTotalFiles) + ") : " + ePub +" checking...\r"), | |
658 | ||
659 | tmpOutputFile = os.path.join(targetDir, file + "Check.log") | |
660 | cmdStr = ePubCheckCmd + ' ' + '"' + ePub + '"' + overrideCmdStr | |
661 | #if gopts.jarArgs != "": | |
662 | # | |
663 | # hack to add default json output file name to the -j param until that bug gets fixed in ePubCheck | |
664 | # | |
665 | if not gopts.jsonNamedByEAN: | |
666 | jsonFileName = file + 'Check.json' | |
667 | else: | |
668 | if not file[0:13].isdigit(): | |
669 | jsonFileName = file + 'Check.json' | |
670 | print ('File: "' + file + '" does not conform to the leading-EAN naming scheme; json output file will be named: "' + jsonFileName +'"') | |
671 | else: | |
672 | jsonFileName = file[0:13] + ".ePubCheck.json" | |
673 | jsonFile = os.path.join(targetDir, jsonFileName) | |
674 | cmdStr += ' -u -j "' + jsonFile + '"' + expString | |
675 | ||
676 | # | |
677 | # add the -u flag to epubcheck if verbose output is on... | |
678 | # | |
679 | if gopts.verbose: cmdStr += " -u" | |
680 | # | |
681 | # add the value of the env var ePubCheckCustomMessageFile if it exists | |
682 | # | |
683 | ||
684 | # | |
685 | # add any more jarArgs to the command line | |
686 | # this is very brittle as the script adds a bunch of things automatically which could conflict | |
687 | # | |
688 | cmdStr += " " + gopts.jarArgs | |
689 | # print ("ePubCheck cmdstr= " + cmdStr) | |
690 | # | |
691 | # run ePubCheck and present the results stored in the json output file; if the script is running in Python 3.3 or later, use the timeout= parameter | |
692 | # | |
693 | tmpOutput = open(tmpOutputFile, "w") | |
694 | if(useTimeout): | |
695 | try: | |
696 | checkTime = time.time() | |
697 | checkProcess = subprocess.call(cmdStr, stdout=tmpOutput, stderr=tmpOutput, shell=True, timeout=gopts.timeoutVal) | |
698 | checkTime = time.time() - checkTime | |
699 | except subprocess.TimeoutExpired: | |
700 | if logging: | |
701 | elapsedTime = time.time() - startTime | |
702 | checkTime = gopts.timeoutVal | |
703 | logStats(statsLog, "NA", targetDir, file, ePub, str(elapsedTime), str(checkTime), "NA", "NA", "NA", "NA", "NA", "NA", "NA", "ePubcheck hang", "NA", "NA", "NA") | |
704 | print ("ePubCheck of " + file + " has hung; skipping this file") | |
705 | continue | |
706 | else: | |
707 | checkTime = time.time() | |
708 | checkProcess = subprocess.call(cmdStr, stdout=tmpOutput, stderr=tmpOutput, shell=True) | |
709 | checkTime = time.time() - checkTime | |
710 | elapsedTime = time.time() - startTime | |
711 | tmpOutput.close() | |
712 | if checkForException(tmpOutputFile, "com.adobe.epubcheck.tool.Checker.main"): | |
713 | print(" Epubcheck threw an exception processing file: " + file + "; aborting processing of this file...") | |
714 | logStats(statsLog, "NA", targetDir, file, ePub, str(elapsedTime), str(checkTime), "NA", "NA", "NA", "NA", "NA", "NA", "NA", "ePubcheck exception", "NA", "NA", "NA") | |
715 | continue | |
716 | # | |
717 | # examine the .json output file and check the number of errors reported | |
718 | # | |
719 | if os.path.exists(jsonFile): | |
720 | try: | |
721 | jsonData = open(jsonFile, "r").read() | |
722 | checkResults = json.loads(jsonData) | |
723 | except: | |
724 | if logging: | |
725 | elapsedTime = time.time() - startTime | |
726 | logStats(statsLog, "NA", targetDir, file, ePub, str(elapsedTime), str(checkTime), "NA", "NA", "NA", "NA", "NA", "NA", "NA", "exception on .json json.loads", "NA", "NA", "NA") | |
727 | print ("json output file " + jsonFile + " caused exception on load, processing of " + ePub + " abandoned...") | |
728 | continue | |
729 | checkerVersion = checkResults["checker"]["checkerVersion"] | |
730 | nErrs = checkResults["checker"]["nFatal"] + checkResults["checker"]["nError"] | |
731 | if nErrs == 0: | |
732 | print ("File #" + str(nChecked) + " (of " + str(nTotalFiles) + ") : " + ePub + " contained NO severe error messages:") | |
733 | else: | |
734 | print ("File #" + str(nChecked) + " (of " + str(nTotalFiles) + ") : " + ePub + " has " + str(nErrs) + " severe errors (" + str(checkResults["checker"]["nFatal"]) +" FATAL and " + str(checkResults["checker"]["nError"]) + " ERROR messages)") | |
735 | # | |
736 | # custom message file in use? | |
737 | # | |
738 | customMessages = checkResults["customMessageFileName"] | |
739 | if customMessages is not None: | |
740 | print(" Custom message file in use: '" + customMessages +"'") | |
741 | print("") | |
742 | # | |
743 | # failure case, expected .json file not found, keep trying! | |
744 | # | |
745 | else: | |
746 | if logging: | |
747 | elapsedTime = time.time() - startTime | |
748 | logStats(statsLog, "NA", targetDir, file, ePub, str(elapsedTime), str(checkTime), "NA", "NA", "NA", "NA", "na", "NA", "NA", ".json output FNF", "NA", "NA", "NA") | |
749 | print ("json output file " + jsonFile + " NOT found, processing aborting...") | |
750 | continue | |
751 | # | |
752 | # preserve the json if requested; [TODO] only diff them if saving and an older version was found | |
753 | # | |
754 | # simple case, no prior json file exists, save it away and move on | |
755 | # | |
756 | if not os.path.exists(os.path.join(errorsDir, jsonFileName)): | |
757 | if saveJson: | |
758 | shutil.move(jsonFile, errorsDir) | |
759 | oldPath = "NA" | |
760 | checkChanges = "NA" | |
761 | pubChanged = "NA" | |
762 | spineChanged = "NA" | |
763 | manifestChanged = "NA" | |
764 | messagesChanged = "NA" | |
765 | oldJsonExists = False | |
766 | else: | |
767 | # | |
768 | # an existing json file for this epub was found; load the old json to see if it's identical; if it is, save the latest copy | |
769 | # | |
770 | oldJsonExists = True | |
771 | if gopts.compareJson: | |
772 | jsonData = open(os.path.join(errorsDir, jsonFileName), "r").read() | |
773 | oldResults = json.loads(jsonData) | |
774 | jsonChanges = CompareResults.compareResults(oldResults, checkResults) | |
775 | changesJsonFile = jsonFileName + "diffs.json" | |
776 | changesJson = os.path.join(errorsDir, changesJsonFile) | |
777 | if os.path.exists(changesJson): | |
778 | oldMTime = os.path.getmtime(changesJson) | |
779 | oldFMTtime = time.strftime("%Y-%m-%d-%H%M.%S", time.gmtime(oldMTime)) | |
780 | newName = jsonFileName + "diffs." + oldFMTtime +".json" | |
781 | print (" Found an older json diff output file for " + ePub + "; saving the older version as: '" + os.path.join(errorsDir, newName) + "'\r\n") | |
782 | os.rename(changesJson, os.path.join(errorsDir, newName)) | |
783 | changesFP = open(changesJson, "w") | |
784 | json.dump(jsonChanges, changesFP, indent=2) | |
785 | changesFP.close() | |
786 | ||
787 | # | |
788 | # json schema change: moved file name from publication to checker; if the name isn't found, use one in publication; this hack can be removed soon. | |
789 | # | |
790 | try: | |
791 | oldPath = oldResults["checker"]["path"] | |
792 | except: | |
793 | oldPath = oldResults["publication"]["path"] | |
794 | #if (oldResults["publication"] == checkResults["publication"]) and (oldResults["items"] == checkResults["items"]) and (oldResults["messages"] == checkResults["messages"]): | |
795 | if jsonChanges["summary"]["publicationChanges"] + jsonChanges["summary"]["itemChanges"] == 0: | |
796 | print (' ePubCheck done on "' + checkResults["checker"]["checkDate"] + '" matched results of check done of "' + str(oldPath) + '" on "' + oldResults["checker"]["checkDate"] + '"; preserving the latest output file...') | |
797 | # | |
798 | # save the latest version | |
799 | # | |
800 | if saveJson: | |
801 | os.remove(os.path.join(errorsDir, jsonFileName)) | |
802 | shutil.move(jsonFile, errorsDir) | |
803 | else: | |
804 | os.remove(jsonFile) | |
805 | pubChanged = False | |
806 | spineChanged = False | |
807 | manifestChanged = False | |
808 | checkChanges = False | |
809 | if oldResults["messages"] == checkResults["messages"]: | |
810 | messagesChanged = False | |
811 | ||
812 | # | |
813 | # the two json files aren't identical; preserve the old one by adding a timestamp to the file name and save the new one as "file".epubCheck.json | |
814 | # | |
815 | else: | |
816 | checkChanges = True | |
817 | if saveJson: | |
818 | oldMTime = os.path.getmtime(os.path.join(errorsDir, jsonFileName)) | |
819 | oldFMTtime = time.strftime("%Y-%m-%d-%H%M.%S", time.gmtime(oldMTime)) | |
820 | newName = jsonFileName + "." + oldFMTtime +".json" | |
821 | print (" Found an older, differing, json output file for " + ePub + "; saving the older version as: '" + os.path.join(errorsDir, newName) + "'\r\n") | |
822 | os.rename(os.path.join(errorsDir, jsonFileName), os.path.join(errorsDir, newName)) | |
823 | shutil.move(jsonFile, errorsDir) | |
824 | else: | |
825 | os.remove(jsonFile) | |
826 | ||
827 | print ("") | |
828 | ppEpubCheckChanges(jsonChanges) | |
829 | pubChanged = jsonChanges["summary"]["encodedPubChanges"] | |
830 | spineChanged = jsonChanges["summary"]["encodedSpineChanges"] | |
831 | manifestChanged = jsonChanges["summary"]["encodedManiChanges"] | |
832 | ||
833 | # | |
834 | # log output to console | |
835 | # | |
836 | if nErrs > 0 and gopts.showErrors: | |
837 | print ("") | |
838 | print (" Messages Summary:") | |
839 | logMessages(checkResults["messages"], "FATAL") | |
840 | logMessages(checkResults["messages"], "ERROR") | |
841 | if gopts.verbose or gopts.showWarning: | |
842 | logMessages(checkResults["messages"], "WARNING") | |
843 | if gopts.verbose or gopts.showUsage: | |
844 | logMessages(checkResults["messages"], "USAGE") | |
845 | if oldJsonExists and gopts.compareJson: | |
846 | if checkResults["messages"] != oldResults["messages"] and (gopts.verbose): | |
847 | messagesChanged = compareMessages(checkResults["messages"], oldResults["messages"]) | |
848 | if checkResults["messages"] != oldResults["messages"] and not gopts.verbose: | |
849 | messagesChanged = "True, comparison disabled" | |
850 | print (" Generated messages from " + file + " changed...") | |
851 | print (" --") | |
852 | if checkResults["messages"] == oldResults["messages"] and gopts.verbose: | |
853 | messagesChanged = "Mess==" | |
854 | print (" Generated messages from " + file + " are unchanged...") | |
855 | print (" --") | |
856 | else: | |
857 | messagesChanged = "NA, comparison disabled" | |
858 | print ("--") | |
859 | print ("") | |
860 | # | |
861 | # save the results and cleanup after each file | |
862 | # | |
863 | elapsedTime = time.time() - startTime | |
864 | if logging: logStats(statsLog, checkerVersion, targetDir, file, ePub, str(elapsedTime), str(checkTime), str(checkResults["publication"]["ePubVersion"]), str(oldPath), str(checkChanges), pubChanged, spineChanged, manifestChanged, str(messagesChanged), str(checkResults["checker"]["nFatal"]), str(checkResults["checker"]["nError"]), str(checkResults["publication"]["isScripted"]), str(checkResults["publication"]["hasFixedFormat"])) | |
865 | os.remove(tmpOutputFile) |
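The main loop above invokes epubcheck through `subprocess.call(..., timeout=gopts.timeoutVal)` and treats `subprocess.TimeoutExpired` as a recoverable, per-file failure. A minimal standalone sketch of that pattern (the helper name `run_with_timeout` is illustrative, not part of the script; requires Python 3.3+ for the `timeout=` parameter):

```python
import subprocess

def run_with_timeout(cmd, log_path, timeout_secs):
    # Capture stdout/stderr to a per-file log, as BookReporter does;
    # a hang is reported as None so the caller can skip the file.
    with open(log_path, "w") as log:
        try:
            return subprocess.call(cmd, stdout=log, stderr=log,
                                   shell=True, timeout=timeout_secs)
        except subprocess.TimeoutExpired:
            return None
```

On Python 2.7 (the scripts' original target per the shebangs), `subprocess.call` has no `timeout=` parameter, which is why the script gates this path behind a `useTimeout` check.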
0 | #!c:\python27\python | |
1 | import os | |
2 | import sys | |
3 | import optparse | |
4 | import json | |
5 | import Dictionary | |
6 | ||
7 | def compareResults(oldJson, newJson): | |
8 | # | |
9 | # initialize the differences structure and local copies of the publication metadata and the manifest for comparison | |
10 | # | |
11 | jsonDelta = {} | |
12 | jsonDelta["summary"] = {} | |
13 | jsonDelta["summary"]["newCheck"] = newJson["checker"] | |
14 | jsonDelta["summary"]["oldCheck"] = oldJson["checker"] | |
15 | jsonDelta["publication"] = {} | |
16 | jsonDelta["publication"]["adds"] = {} | |
17 | jsonDelta["publication"]["cuts"] = {} | |
18 | jsonDelta["publication"]["changes"] = {} | |
19 | jsonDelta["manifest"] = {} | |
20 | jsonDelta["manifest"]["adds"] = {} | |
21 | jsonDelta["manifest"]["cuts"] = {} | |
22 | jsonDelta["manifest"]["changes"] = {} | |
23 | jsonDelta["spine"] = {} | |
24 | jsonDelta["spine"]["adds"] = {} | |
25 | jsonDelta["spine"]["cuts"] = {} | |
26 | jsonDelta["spine"]["orderChanges"] = {} | |
27 | jsonDelta["spine"]["contentChanges"] = {} | |
28 | jsonDelta["spine"]["unchanged"] = 0 | |
29 | oldPub = oldJson["publication"] | |
30 | newPub = newJson["publication"] | |
31 | newSpine = {} | |
32 | newSpineContents = {} | |
33 | oldSpineContents = {} | |
34 | oldSpine = {} | |
35 | newMani = newJson["items"] | |
36 | oldMani = oldJson["items"] | |
37 | ||
38 | newManiDict = Dictionary.makeDict(newMani, "id") | |
39 | oldManiDict = Dictionary.makeDict(oldMani, "id") | |
40 | # | |
41 | # compare the publication metadata | |
42 | # | |
43 | deltaPub = Dictionary.DictCompare(newPub, oldPub) | |
44 | for item in deltaPub.added(): | |
45 | jsonDelta["publication"]["adds"][item] = newPub[item] | |
46 | for item in deltaPub.removed(): | |
47 | jsonDelta["publication"]["cuts"][item] = oldPub[item] | |
48 | pubChanges = deltaPub.changed() | |
49 | if len(pubChanges) > 0: | |
50 | for item in pubChanges: | |
51 | jsonDelta["publication"]["changes"][item] = {"newValue" : newPub[item], "oldValue" : oldPub[item]} | |
52 | if len(jsonDelta["publication"]["adds"]) + len(jsonDelta["publication"]["cuts"]) + len(jsonDelta["publication"]["changes"]) == 0: | |
53 | pubChanges = "Pub==" | |
54 | else: | |
55 | pubChanges = "Pub-" | |
56 | if len(jsonDelta["publication"]["adds"]) > 0: | |
57 | pubChanges += "A" + str(len(jsonDelta["publication"]["adds"])) | |
58 | else: | |
59 | pubChanges += "xA" | |
60 | if len(jsonDelta["publication"]["cuts"]) > 0: | |
61 | pubChanges += "-R" + str(len(jsonDelta["publication"]["cuts"])) | |
62 | else: | |
63 | pubChanges += "-xR" | |
64 | if len(jsonDelta["publication"]["changes"]) > 0: | |
65 | pubChanges += "-C" + str(len(jsonDelta["publication"]["changes"])) | |
66 | else: | |
67 | pubChanges += "-xC" | |
68 | jsonDelta["summary"]["publicationChanges"] = len(jsonDelta["publication"]["adds"]) + len(jsonDelta["publication"]["cuts"]) + len(jsonDelta["publication"]["changes"]) | |
69 | jsonDelta["summary"]["encodedPubChanges"] = pubChanges | |
70 | # print(" Calculated publication changes: adds: " + str(len(jsonDelta["publication"]["adds"])) + "; cuts: " + str(len(jsonDelta["publication"]["cuts"])) + "; changes: " + str(len(jsonDelta["publication"]["changes"]))) | |
71 | # | |
72 | # compare the spines | |
73 | # | |
74 | for item in newMani: | |
75 | if item["isSpineItem"]: | |
76 | newSpine[item["id"]] = item["spineIndex"] | |
77 | newSpineContents[item["id"]] = item["checkSum"] | |
78 | ||
79 | for item in oldMani: | |
80 | if item["isSpineItem"]: | |
81 | oldSpine[item["id"]] = item["spineIndex"] | |
82 | oldSpineContents[item["id"]] = item["checkSum"] | |
83 | ||
84 | ||
85 | deltaSpine = Dictionary.DictCompare(newSpine, oldSpine) | |
86 | deltaSpineContents = Dictionary.DictCompare(newSpineContents, oldSpineContents) | |
87 | # | |
88 | # look for adds/cuts/order changes in the spine | |
89 | # | |
90 | if len(deltaSpine.added()) > 0: | |
91 | for item in deltaSpine.added(): | |
92 | jsonDelta["spine"]["adds"][item] = newSpine[item] | |
93 | if len(deltaSpine.removed()) > 0: | |
94 | for item in deltaSpine.removed(): | |
95 | jsonDelta["spine"]["cuts"][item] = oldSpine[item] | |
96 | if len(deltaSpine.changed()) > 0: | |
97 | jsonDelta["spine"]["orderChanges"] = {} | |
98 | for item in deltaSpine.changed(): | |
99 | jsonDelta["spine"]["orderChanges"][item] = {} | |
100 | jsonDelta["spine"]["orderChanges"][item]["newSpineIndex"] = newSpine[item] | |
101 | jsonDelta["spine"]["orderChanges"][item]["oldSpineIndex"] = oldSpine[item] | |
102 | jsonDelta["spine"]["unchanged"] = len(deltaSpine.unchanged()) | |
103 | # | |
104 | # look for content changes only in items common to both spines | |
105 | # | |
106 | if len(deltaSpineContents.changed()) > 0: | |
107 | jsonDelta["spine"]["contentChanges"] = {} | |
108 | for item in deltaSpineContents.changed(): | |
109 | jsonDelta["spine"]["contentChanges"][item] = newManiDict[item]["fileName"] | |
110 | # | |
111 | # summarize the spine changes | |
112 | # | |
113 | if len(jsonDelta["spine"]["adds"]) + len(jsonDelta["spine"]["cuts"]) + len(jsonDelta["spine"]["orderChanges"]) + len(jsonDelta["spine"]["contentChanges"]) == 0: | |
114 | spineChanges = "Spine==-N" + str(newPub["nSpines"]) | |
115 | else: | |
116 | spineChanges = "Spine-N" + str(newPub["nSpines"]) | |
117 | if jsonDelta["spine"]["unchanged"] == 0: | |
118 | spineChanges += "-xU" | |
119 | else: | |
120 | spineChanges += "-U" + str(jsonDelta["spine"]["unchanged"]) | |
121 | if len(jsonDelta["spine"]["adds"]) > 0: | |
122 | spineChanges += "-A" + str(len(jsonDelta["spine"]["adds"])) | |
123 | else: | |
124 | spineChanges += "-xA" | |
125 | if len(jsonDelta["spine"]["cuts"]) > 0: | |
126 | spineChanges += "-R" + str(len(jsonDelta["spine"]["cuts"])) | |
127 | else: | |
128 | spineChanges += "-xR" | |
129 | if len(jsonDelta["spine"]["orderChanges"]) > 0: | |
130 | spineChanges += "-OC" + str(len(jsonDelta["spine"]["orderChanges"])) | |
131 | else: | |
132 | spineChanges += "-xOC" | |
133 | if len(jsonDelta["spine"]["contentChanges"]) > 0: | |
134 | spineChanges += "-CC" + str(len(jsonDelta["spine"]["contentChanges"])) | |
135 | else: | |
136 | spineChanges += "-xCC" | |
137 | ||
138 | jsonDelta["summary"]["spineChanges"] = len(jsonDelta["spine"]["adds"]) + len(jsonDelta["spine"]["cuts"]) + len(jsonDelta["spine"]["orderChanges"]) + len(jsonDelta["spine"]["contentChanges"]) | |
139 | jsonDelta["summary"]["encodedSpineChanges"] = spineChanges | |
140 | # | |
141 | # do the manifest | |
142 | # | |
143 | deltaMani = Dictionary.DictCompare(newManiDict, oldManiDict) | |
144 | for item in deltaMani.added(): | |
145 | jsonDelta["manifest"]["adds"][item] = newManiDict[item] | |
146 | for item in deltaMani.removed(): | |
147 | jsonDelta["manifest"]["cuts"][item] = oldManiDict[item] | |
148 | maniChanges = list(deltaMani.changed()) | |
149 | if (len(maniChanges)) > 0: | |
150 | for item in maniChanges: | |
151 | itemDelta = {} | |
152 | deltaItem = Dictionary.DictCompare(newManiDict[item], oldManiDict[item]) | |
153 | if len(deltaItem.added()) > 0: | |
154 | itemDelta["adds"] = {} | |
155 | for theAdd in deltaItem.added(): | |
156 | #print("property added: " + theAdd + " value: " + str(newManiDict[item][theAdd])) | |
157 | itemDelta["adds"][theAdd] = newManiDict[item][theAdd] | |
158 | if len(deltaItem.removed()) > 0: | |
159 | itemDelta["cuts"] = {} | |
160 | for theCut in deltaItem.removed(): | |
161 | itemDelta["cuts"][theCut] = oldManiDict[item][theCut] | |
162 | if len(deltaItem.changed()) > 0: | |
163 | itemDelta["changes"] = {} | |
164 | for itemProp in deltaItem.changed(): | |
165 | if itemProp == "referencedItems": | |
166 | oldRefs = set(oldManiDict[item][itemProp]) | |
167 | newRefs = set(newManiDict[item][itemProp]) | |
168 | refsCommon = newRefs & oldRefs | |
169 | refsAdded = newRefs - oldRefs | |
170 | refsRemoved = oldRefs - refsCommon | |
171 | itemDelta["changes"][itemProp] = { "adds" : list(refsAdded), "cuts" : list(refsRemoved) } | |
172 | elif itemProp == "checkSum": | |
173 | itemDelta["changes"][itemProp] = {"newValue" : newManiDict[item]["fileName"], "oldValue" : oldManiDict[item]["fileName"]} | |
174 | else: | |
175 | itemDelta["changes"][itemProp] = {"newValue" : newManiDict[item][itemProp], "oldValue" : oldManiDict[item][itemProp]} | |
176 | jsonDelta["manifest"]["changes"][item] = itemDelta | |
177 | #print(" Manifest item id : " + item + " changed...") | |
178 | #print(" Manifest item delta: " + str(itemDelta)) | |
179 | ||
180 | # print(" Calculated manifest changes: adds: " + str(len(jsonDelta["manifest"]["adds"])) + "; cuts: " + str(len(jsonDelta["manifest"]["cuts"])) + "; changes: " + str(len(jsonDelta["manifest"]["changes"]))) | |
181 | jsonDelta["summary"]["itemChanges"] = len(jsonDelta["manifest"]["adds"]) + len(jsonDelta["manifest"]["cuts"]) + len(jsonDelta["manifest"]["changes"]) | |
182 | if jsonDelta["summary"]["itemChanges"] == 0: | |
183 | jsonDelta["summary"]["encodedManiChanges"] = "Mani==" | |
184 | else: | |
185 | maniChanges = "Mani-" | |
186 | if len(jsonDelta["manifest"]["adds"]) > 0: | |
187 | maniChanges += "A" + str(len(jsonDelta["manifest"]["adds"])) | |
188 | else: | |
189 | maniChanges += "xA" | |
190 | if len(jsonDelta["manifest"]["cuts"]) > 0: | |
191 | maniChanges += "-R" + str(len(jsonDelta["manifest"]["cuts"])) | |
192 | else: | |
193 | maniChanges += "-xR" | |
194 | if len(jsonDelta["manifest"]["changes"]) > 0: | |
195 | maniChanges += "-C" + str(len(jsonDelta["manifest"]["changes"])) | |
196 | else: | |
197 | maniChanges += "-xC" | |
198 | jsonDelta["summary"]["encodedManiChanges"] = maniChanges | |
199 | return(jsonDelta)⏎ |
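Every adds/cuts/changes section that `compareResults` builds comes from the same set-based dictionary comparison supplied by the `Dictionary.DictCompare` helper. A self-contained sketch of that pattern (`dict_delta` and its result keys are illustrative, not the real Dictionary module API):

```python
def dict_delta(new, old):
    # Set-based comparison in the style of Dictionary.DictCompare:
    # keys only in `new` are adds, keys only in `old` are cuts,
    # shared keys with differing values are changes.
    new_keys, old_keys = set(new), set(old)
    shared = new_keys & old_keys
    return {
        "adds": {k: new[k] for k in new_keys - old_keys},
        "cuts": {k: old[k] for k in old_keys - new_keys},
        "changes": {k: {"newValue": new[k], "oldValue": old[k]}
                    for k in shared if new[k] != old[k]},
        "unchanged": sorted(k for k in shared if new[k] == old[k]),
    }
```

Applied to the manifest, `new` and `old` would be the id-keyed item dictionaries produced by `Dictionary.makeDict(items, "id")` above.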
0 | #!c:\python27\python | |
1 | import os | |
2 | import sys | |
3 | import optparse | |
4 | import json | |
5 | import CompareResults | |
6 | ||
7 | def parse_compare_args(argv): | |
8 | prog_dir = os.path.dirname(argv[0]) | |
9 | usage = """ | |
10 | Usage: %s <originalFile> <newFile> [OPTION] | |
11 | Compare epubcheck json output | |
12 | """[1:-1] % os.path.basename(argv[0]) | |
13 | ||
14 | parser = optparse.OptionParser(usage=usage) | |
15 | parser.add_option("-r", "--resultFile", dest="resultFile", type="str", default="", | |
16 | help="The destination file for the comparison results") | |
17 | ||
18 | opts,args = parser.parse_args(argv[1:]) | |
19 | return opts,args | |
20 | ||
21 | global gopts | |
22 | gopts,args = parse_compare_args(sys.argv) | |
23 | if (len(sys.argv) < 3): | |
24 | print "ERROR: Insufficient args were provided. Specify both the original and new json files" | |
25 | sys.exit() | |
26 | gopts.oldFile = os.path.join(".", sys.argv[1]) | |
27 | gopts.newFile = os.path.join(".", sys.argv[2]) | |
28 | if (not os.path.isfile(gopts.oldFile)): | |
29 | print "ERROR: Original file " + gopts.oldFile + " could not be found" | |
30 | sys.exit() | |
31 | if (not os.path.isfile(gopts.newFile)): | |
32 | print "ERROR: New file " + gopts.newFile + " could not be found" | |
33 | sys.exit() | |
34 | oldJsonData = open(gopts.oldFile, "r").read() | |
35 | oldResults = json.loads(oldJsonData) | |
36 | newJsonData = open(gopts.newFile, "r").read() | |
37 | newResults = json.loads(newJsonData) | |
38 | jsonChanges = CompareResults.compareResults(oldResults, newResults) | |
39 | if (gopts.resultFile == ""): | |
40 | gopts.resultFile = os.path.basename(gopts.oldFile) + "_" + os.path.basename(gopts.newFile) + ".json" | |
41 | changesFP = open(gopts.resultFile, "w") | |
42 | json.dump(jsonChanges, changesFP, indent=2) | |
43 | changesFP.close() | |
44 | print "Finished! Results can be found in " + gopts.resultFile⏎ |
0 | #!c:\python27\python | |
1 | ||
2 | import os | |
3 | import sys | |
4 | import datetime | |
5 | import time | |
6 | import webbrowser | |
7 | import urllib | |
8 | import optparse | |
9 | import subprocess | |
10 | import zipfile | |
11 | import tempfile | |
12 | import shutil | |
13 | import glob | |
14 | ||
15 | def parse_args(argv): | |
16 | prog_dir = os.path.dirname(argv[0]) | |
17 | usage = """ | |
18 | Usage: %s [OPTION] | |
19 | Collect ePubStats on the ePub files in the target directory | |
20 | """[1:-1] % os.path.basename(argv[0]) | |
21 | ||
22 | parser = optparse.OptionParser(usage=usage) | |
23 | parser.add_option("-n", "--newerDir", dest="newerDir", type="str", | |
24 | help="Directory holding the latest versions of the ePub versions to compare") | |
25 | parser.add_option("-o", "--olderDir", dest="olderDir", type="str", | |
26 | help="Directory holding the older version of the ePubs being compared") | |
27 | parser.add_option("-p", "--preserveDiffs", action="store_true", dest="saveDiffs", default=False, | |
28 | help=r"Use this flag to save the differences file in the --diffLogDir directory") | |
29 | parser.add_option("--diffLogDir", dest="diffsDir", type="str", default=r"diffLogs", | |
30 | help=r"If --preserveDiffs is specified, the diffs are stored in the directory 'diffLogs' unless overridden by the value of this option.") | |
31 | parser.add_option("--logdir", dest="logdir", type="str", | |
32 | default=r"$EPUBCHECK-LOGS", | |
33 | help=r"Log file location used by this tool, default=%EPUBCHECK-LOGS%\CompareLog.TabDelimitedFile; if EPUBCHECK-LOGS is not defined, the file will be written to the current working directory") | |
34 | parser.add_option("--logfile", dest="logfile", type="str", | |
35 | default=r"CompareEpubsLog.TabDelimitedFile", | |
36 | help=r"Log file name used by this tool, default=CompareEpubsLog.TabDelimitedFile") | |
37 | ||
38 | opts,args = parser.parse_args(argv[1:]) | |
39 | return opts,args | |
40 | ||
41 | ||
42 | def logStats(log_file, elapsedTime, newerFile, olderFile, zipsDiffer, nDiffs): | |
43 | ||
44 | now = datetime.datetime.now() | |
45 | dateTime = str(now.date()) + "\t" + str(now.time()) | |
46 | ||
47 | if not os.path.exists(log_file): | |
48 | print "File " + log_file + " doesn't exist, initializing file..." | |
49 | f = open(log_file, 'a') | |
50 | f.write("logDate\t") | |
51 | f.write("logTime\t") | |
52 | f.write("elapsedTime\t") | |
53 | f.write("newerFile\t") | |
54 | f.write("olderFile\t") | |
55 | f.write("zipsDiffer\t") | |
56 | f.write("nDiffs\n") | |
57 | else: | |
58 | f = open(log_file, 'a') | |
59 | ||
60 | f.write(str(now.date()) + "\t" + str(now.time()) + "\t") | |
61 | f.write(elapsedTime + "\t") | |
62 | f.write(newerFile.rstrip("\r") + "\t") | |
63 | f.write(olderFile.rstrip("\r") + "\t") | |
64 | f.write(zipsDiffer + "\t") | |
65 | f.write(nDiffs + "\n") | |
66 | f.close() | |
67 | ||
68 | return "ePubStats Logging complete..." | |
69 | ||
70 | global gopts | |
71 | gopts,args = parse_args(sys.argv) | |
72 | ||
73 | # | |
74 | # verify that the log dir and file are writable | |
75 | # | |
76 | if not os.path.isdir(os.path.expandvars(gopts.logdir)): | |
77 | print ('Activity logging environment variable "' + gopts.logdir + '" is not a valid dir, logging to the current working directory') | |
78 | statsLog = os.path.join(".", gopts.logfile) | |
79 | else: | |
80 | statsLog = os.path.join(os.path.expandvars(gopts.logdir), gopts.logfile) | |
81 | print ("Activity logging is being performed to " + statsLog) | |
82 | ||
83 | olderDir = gopts.olderDir | |
84 | newerDir = gopts.newerDir | |
85 | ||
86 | print("B&N CompareEpubVersions Tool") | |
87 | print(" Compare two directory trees holding differing versions of the same ePubs") | |
88 | print("--\n") | |
89 | print('Comparing the newer ePub versions in "' + newerDir + '" to older versions in "' + olderDir +'"') | |
90 | print("--\n") | |
91 | ||
92 | newerFiles = os.listdir(newerDir) | |
93 | olderFiles = os.listdir(olderDir) | |
94 | newCount = len(newerFiles) | |
95 | ||
96 | if newCount != len(olderFiles): | |
97 | print(' Warning: The number of files in "' + newerDir + '" (' + str(len(newerFiles)) + ') does not match the number in "' + olderDir + '" (' + str(len(olderFiles)) + ')') | |
98 | ||
99 | diffCmd = "diff -q -w -r " | |
100 | nCompared = 0 | |
101 | nChecked = 0 | |
102 | ||
103 | for file in newerFiles: | |
104 | startTime = time.time() | |
105 | nChecked += 1 | |
106 | if os.path.splitext(file)[-1].lower() != ".epub": | |
107 | print (" File #" + str(nChecked) + " (of " + str(newCount) + "): " + file + " is not an ePub, skipped...\n--\n") | |
108 | continue | |
109 | ||
110 | nCompared += 1 | |
111 | ||
112 | ean = file[0:13] | |
113 | olderTarget = glob.glob(os.path.join(olderDir, ean + "*.epub")) | |
114 | if len(olderTarget) == 0: | |
115 | print(" Error: no older version of " + os.path.join(newerDir, file) + " was found in " + olderDir) | |
116 | logStats(statsLog, "NA", os.path.join(newerDir, file), olderDir, str(False), "Older File Not Found") | |
117 | nCompared -= 1 | |
118 | continue | |
119 | ||
120 | if len(olderTarget) > 1: | |
121 | print(" Warning: " + str(len(olderTarget)) + " ePubs found matching the target ean: " + ean) | |
122 | # print("File: " + file + " ean: " + ean + " olderTarget: " + str(olderTarget)) | |
123 | ||
124 | newTmpBookDir = tempfile.mkdtemp(prefix="CePubV-Newer") | |
125 | oldTmpBookDir = tempfile.mkdtemp(prefix="CePubV-Older") | |
126 | ||
127 | newZipFile = os.path.join(newerDir, file) | |
128 | oldZipFile = olderTarget[0] | |
129 | ||
130 | tmpOutputFile = os.path.join(newerDir, ean + ".compare.log") | |
131 | tmpOutput = open(tmpOutputFile, mode="w") | |
132 | cmdStr = diffCmd + '"' + newZipFile + '" "' + oldZipFile + '"' | |
133 | print(" Comparing the zips themselves: " + cmdStr) | |
134 | subprocess.call(cmdStr, stdout=tmpOutput, stderr=tmpOutput, shell=True) | |
135 | tmpOutput.close() | |
136 | tmpOutput = open(tmpOutputFile, mode="r") | |
137 | theDiffs = tmpOutput.readlines() | |
138 | tmpOutput.close() | |
139 | if len(theDiffs) == 0: | |
140 | zipsDiffer = False | |
141 | nDiffs = 0 | |
142 | print(" File #" + str(nChecked) + " (of " + str(newCount) + ") (" + str(nCompared) + " compared) - " + file + ": both ePub container files are identical") | |
143 | else: | |
144 | zipsDiffer = True | |
145 | print(" File #" + str(nChecked) + " (of " + str(newCount) + ") (" + str(nCompared) + " compared) - " + file + ": ePub container files differ") | |
146 | for line in theDiffs: | |
147 | print(" Zip Diff: " + line.rstrip()) | |
148 | # | |
149 | # if they are different, lets see how different | |
150 | # | |
151 | if zipsDiffer: | |
152 | newZip = zipfile.ZipFile(newZipFile, 'r') | |
153 | oldZip = zipfile.ZipFile(oldZipFile, 'r') | |
154 | ||
155 | newZip.extractall(newTmpBookDir) | |
156 | oldZip.extractall(oldTmpBookDir) | |
157 | ||
158 | #print(" New zip: " + file + " extracted to: " + str(newTmpBookDir)) | |
159 | #print(" Old zip: " + olderTarget[0] + " extracted to: " + str(oldTmpBookDir)) | |
160 | ||
161 | tmpOutputFile = os.path.join(newerDir, ean + ".compare.log") | |
162 | tmpOutput = open(tmpOutputFile, mode="w") | |
163 | cmdStr = diffCmd + '"' + newTmpBookDir + '" "' + oldTmpBookDir + '"' | |
164 | subprocess.call(cmdStr, stdout=tmpOutput, stderr=tmpOutput, shell=True) | |
165 | tmpOutput.close() | |
166 | tmpOutput = open(tmpOutputFile, mode="r") | |
167 | theDiffs = tmpOutput.readlines() | |
168 | tmpOutput.close() | |
169 | nDiffs = len(theDiffs) | |
170 | if nDiffs > 0: | |
171 | print(" File #" + str(nChecked) + " (of " + str(newCount) + ") (" + str(nCompared) + " compared) - " + file + " has " + str(nDiffs) + " differences:") | |
172 | for line in theDiffs: | |
173 | print(" Asset Diff: " + line.rstrip()) | |
174 | else: | |
175 | print(" File #" + str(nChecked) + " (of " + str(newCount) + ") (" + str(nCompared) + " compared) - " + file + ": ePub contents in differing containers are identical") | |
176 | ||
177 | shutil.rmtree(newTmpBookDir) | |
178 | shutil.rmtree(oldTmpBookDir) | |
179 | ||
180 | print("--") | |
181 | print("") | |
182 | ||
183 | if gopts.saveDiffs: | |
184 | diffsDir = os.path.join(newerDir, gopts.diffsDir) | |
185 | if not os.path.exists(diffsDir): | |
186 | print("Dir: " + diffsDir + " does not exist; creating it...") | |
187 | os.mkdir(diffsDir) | |
188 | shutil.move(tmpOutputFile, diffsDir) | |
189 | else: | |
190 | os.remove(tmpOutputFile) | |
191 | ||
192 | elapsedTime = str(time.time() - startTime) | |
193 | logStats(statsLog, elapsedTime, os.path.join(newerDir, file), olderTarget[0], str(zipsDiffer), str(nDiffs))⏎ |
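The script above shells out to an external diff (via `diffCmd` with `shell=True`) and counts the lines it prints. The same container comparison can be done with the standard library's `filecmp`, avoiding the subprocess and the temp log file. This is an illustrative sketch, not part of the original script; the function name and demo data are hypothetical:

```python
import filecmp
import os
import shutil
import tempfile
import zipfile

def compare_epub_contents(new_zip_path, old_zip_path):
    # Extract both containers to temp dirs and collect the paths of
    # members that differ in content or exist on only one side.
    new_dir = tempfile.mkdtemp(prefix="CePubV-Newer")
    old_dir = tempfile.mkdtemp(prefix="CePubV-Older")
    diffs = []
    try:
        with zipfile.ZipFile(new_zip_path) as z:
            z.extractall(new_dir)
        with zipfile.ZipFile(old_zip_path) as z:
            z.extractall(old_dir)

        def walk(left, right):
            cmp = filecmp.dircmp(left, right)
            # Compare common files by content (shallow=False), not just by stat.
            _, mismatch, errors = filecmp.cmpfiles(
                left, right, cmp.common_files, shallow=False)
            for name in mismatch + errors + cmp.left_only + cmp.right_only:
                diffs.append(os.path.join(left, name))
            for sub in cmp.common_dirs:
                walk(os.path.join(left, sub), os.path.join(right, sub))

        walk(new_dir, old_dir)
        return diffs
    finally:
        shutil.rmtree(new_dir)
        shutil.rmtree(old_dir)

# Demo with two tiny synthetic containers whose chapter text differs:
work = tempfile.mkdtemp()
for name, text in (("new.zip", "chapter one, revised"), ("old.zip", "chapter one")):
    with zipfile.ZipFile(os.path.join(work, name), "w") as z:
        z.writestr("mimetype", "application/epub+zip")
        z.writestr("OEBPS/ch1.xhtml", text)

diffs = compare_epub_contents(os.path.join(work, "new.zip"),
                              os.path.join(work, "old.zip"))
print(diffs)  # one entry: the differing OEBPS/ch1.xhtml
shutil.rmtree(work)
```

Unlike the shell-out, this returns structured paths rather than raw diff lines, so `nDiffs` would simply be `len(diffs)`.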
0 | #!c:\python27\python | |
1 | ||
2 | class DictCompare(object): | |
3 | """ | |
4 | Calculate the difference between two dictionaries as: | |
5 | (1) items added | |
6 | (2) items removed | |
7 | (3) keys same in both but changed values | |
8 | (4) keys same in both and unchanged values | |
9 | """ | |
10 | def __init__(self, current_dict, past_dict): | |
11 | self.current_dict, self.past_dict = current_dict, past_dict | |
12 | self.current_keys, self.past_keys = [ | |
13 | set(d.keys()) for d in (current_dict, past_dict) | |
14 | ] | |
15 | self.intersect = self.current_keys.intersection(self.past_keys) | |
16 | ||
17 | def added(self): | |
18 | return self.current_keys - self.intersect | |
19 | ||
20 | def removed(self): | |
21 | return self.past_keys - self.intersect | |
22 | ||
23 | def changed(self): | |
24 | return set(o for o in self.intersect | |
25 | if self.past_dict[o] != self.current_dict[o]) | |
26 | ||
27 | def unchanged(self): | |
28 | return set(o for o in self.intersect | |
29 | if self.past_dict[o] == self.current_dict[o]) | |
30 | ||
31 | def makeDict(inputArray, key): | |
32 | """Index a list of dicts by the value of item[key]; later duplicates win.""" | |
33 | outputDict = {} | |
34 | for item in inputArray: | |
35 | outputDict[item[key]] = item | |
36 | return outputDict |
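A quick usage sketch of how `makeDict` and `DictCompare` compose: index two record lists by a key field, then diff the resulting dicts. The class and helper are restated in condensed form so the snippet runs on its own; the book records and EANs are hypothetical:

```python
class DictCompare(object):
    # Condensed restatement of the class above, for a runnable example.
    def __init__(self, current_dict, past_dict):
        self.current_dict, self.past_dict = current_dict, past_dict
        self.current_keys = set(current_dict.keys())
        self.past_keys = set(past_dict.keys())
        self.intersect = self.current_keys & self.past_keys

    def added(self):
        return self.current_keys - self.intersect

    def removed(self):
        return self.past_keys - self.intersect

    def changed(self):
        return set(k for k in self.intersect
                   if self.past_dict[k] != self.current_dict[k])

def makeDict(inputArray, key):
    # Index a list of records by one of their fields.
    return dict((item[key], item) for item in inputArray)

# Hypothetical book records keyed by EAN:
old_books = [{"ean": "111", "title": "A"}, {"ean": "222", "title": "B"}]
new_books = [{"ean": "222", "title": "B, revised"}, {"ean": "333", "title": "C"}]
dc = DictCompare(makeDict(new_books, "ean"), makeDict(old_books, "ean"))
print(sorted(dc.added()))    # ['333'] -- EANs only in the newer set
print(sorted(dc.removed()))  # ['111'] -- EANs only in the older set
print(sorted(dc.changed()))  # ['222'] -- present in both, record differs
```

Passing the newer dict as `current_dict` and the older one as `past_dict` makes `added()`/`removed()` read in the natural direction.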