Mirror of https://github.com/mozilla/gecko-dev.git

Backed out 4 changesets (c0e8f2c0465f::608c663f691f) (bug 928195) for landing prematurely

--HG-- extra : rebase_source : fa42534ef50a0373738349f17b2ca57510bdd6ac

This commit is contained in:
Parent: f32c80cc29
Commit: d39381a161

CLOBBER (3 lines changed)
@ -18,5 +18,4 @@
# Modifying this file will now automatically clobber the buildbot machines \o/
#
Bug 928195 rewrote WebIDL building from the ground up, hopefully eliminating
the need for future clobbers.
Bug 938950 needs a clobber.
@ -28,7 +28,6 @@ Important Concepts
   mozinfo
   preprocessor
   jar-manifests
   webidl

mozbuild
========
@ -1,137 +0,0 @@
.. _webidl:

======
WebIDL
======

WebIDL describes interfaces web browsers are supposed to implement.

The interaction between WebIDL and the build system is somewhat complex.
This document attempts to explain how it all works.

Overview
========

``.webidl`` files throughout the tree define interfaces the browser
implements. Since Gecko/Firefox is implemented in C++, there is a
mechanism to convert these interfaces and associated metadata to
C++ code. That's where the build system comes into play.

All the code for interacting with ``.webidl`` files lives under
``dom/bindings``. There is code in the build system to deal with
WebIDL files explicitly.
WebIDL source file flavors
==========================

Not all ``.webidl`` files are created equal! There are several flavors,
each represented by a separate symbol from :ref:`mozbuild_symbols`.

WEBIDL_FILES
   Refers to regular/static ``.webidl`` files. Most WebIDL interfaces
   are defined this way.

GENERATED_EVENTS_WEBIDL_FILES
   In addition to generating a binding, these ``.webidl`` files also
   generate an event source file.

PREPROCESSED_WEBIDL_FILES
   The ``.webidl`` files are generated by preprocessing an input file.
   They otherwise behave like **WEBIDL_FILES**.

TEST_WEBIDL_FILES
   Like **WEBIDL_FILES**, but the interfaces are for testing only and
   aren't shipped with the browser.

PREPROCESSED_TEST_WEBIDL_FILES
   Like **TEST_WEBIDL_FILES**, except the ``.webidl`` is obtained via
   preprocessing, much like **PREPROCESSED_WEBIDL_FILES**.

GENERATED_WEBIDL_FILES
   The ``.webidl`` for these is obtained through an *external*
   mechanism. Typically there are custom build rules for producing these
   files.
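A ``moz.build`` fragment using these symbols might look like the following. The file names are invented for illustration; in a real ``moz.build`` the variables are predefined by the build system (they are initialized here only so the fragment stands alone):

```python
# Hypothetical moz.build fragment; file names are invented for
# illustration. In a real moz.build these list variables are predefined
# by the build system, so values are only ever appended with +=.
WEBIDL_FILES = []
GENERATED_EVENTS_WEBIDL_FILES = []
PREPROCESSED_WEBIDL_FILES = []
TEST_WEBIDL_FILES = []

WEBIDL_FILES += [
    'EventTarget.webidl',
    'Node.webidl',
]

GENERATED_EVENTS_WEBIDL_FILES += [
    'ProgressEvent.webidl',
]

PREPROCESSED_WEBIDL_FILES += [
    'Navigator.webidl',
]

TEST_WEBIDL_FILES += [
    'TestCodeGen.webidl',
]
```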
Producing C++ code
==================

The most complicated part about WebIDLs is the process by which
``.webidl`` files are converted into C++.

The process begins by staging every ``.webidl`` file to a common
location. For static files, this involves symlinking. However,
preprocessed and externally-generated ``.webidl`` files have special actions.

Producing C++ code from ``.webidl`` consists of 3 logical steps:
parsing, global generation, and binding generation.
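The three steps can be sketched as a tiny pipeline. This is an illustration only; every name and body below is invented and greatly simplified, and the real logic lives in ``GlobalGen.py`` and ``BindingGen.py`` under ``dom/bindings``:

```python
# Toy model of the three logical steps; names and bodies are
# placeholders, not the real dom/bindings implementation.

def parse(webidl_sources):
    # Step 1: all .webidl text goes through one parser instance.
    return {name: "ast(%s)" % name for name in webidl_sources}

def global_generation(asts):
    # Step 2: outputs that depend on *every* input, e.g. PrototypeList.h.
    return {"PrototypeList.h": "list of %d prototypes" % len(asts)}

def binding_generation(asts):
    # Step 3: per-file outputs, FooBinding.h/.cpp for Foo.webidl.
    outputs = {}
    for name in asts:
        stem = name[:-len(".webidl")]
        outputs[stem + "Binding.h"] = "declarations"
        outputs[stem + "Binding.cpp"] = "definitions"
    return outputs

sources = {"Node.webidl": "interface Node {};",
           "Event.webidl": "interface Event {};"}
asts = parse(sources)
generated = global_generation(asts)
generated.update(binding_generation(asts))
```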
Parsing
-------

*Every* ``.webidl`` file is fed into a single parser instance. When a single
``.webidl`` file changes, *every* ``.webidl`` file needs to be reparsed.

Global Generation
-----------------

Global generation takes the parser output and produces some
well-defined output files. These output files essentially depend on
every input ``.webidl`` file.

Binding Generation
------------------

Binding generation refers to the process of generating output files
corresponding to a particular ``.webidl`` file. For all ``.webidl`` files,
we generate a ``*Binding.h`` and ``*Binding.cpp`` file. For generated
events ``.webidl`` files, we also generate ``*.h`` and ``*.cpp`` files.
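The naming convention above can be expressed directly. The helper below is hypothetical, written only to illustrate the scheme; the real mapping is computed inside ``dom/bindings``:

```python
def binding_outputs(webidl_file, is_generated_event=False):
    """Return the output files produced for one .webidl file, following
    the naming scheme described above. Hypothetical helper for
    illustration only."""
    assert webidl_file.endswith(".webidl")
    stem = webidl_file[:-len(".webidl")]
    outputs = [stem + "Binding.h", stem + "Binding.cpp"]
    if is_generated_event:
        # Generated-events files additionally produce the event class.
        outputs += [stem + ".h", stem + ".cpp"]
    return outputs
```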
Requirements
============

This section aims to document the build and developer workflow requirements
for WebIDL.

Parser unit tests
   There are parser tests provided by ``dom/bindings/parser/runtests.py``
   that should run as part of ``make check``. There must be a mechanism
   to run the tests in *human* mode so they output friendly error
   messages.

Mochitests
   There are various mochitests under ``dom/bindings/test``. They should
   be runnable through the standard mechanisms.

Test interfaces are generated as part of the build
   ``TestExampleGenBinding.cpp`` calls into methods from the
   ``TestExampleInterface`` and ``TestExampleProxyInterface`` interfaces.
   These interfaces need to be generated as part of the build.

Running tests automatically rebuilds
   When a developer runs the WebIDL tests, she expects any necessary rebuilds
   to occur.

   This is facilitated through ``mach webidl-test``.

Minimal rebuilds
   Reprocessing every output for every change is expensive. So as not to
   inconvenience people changing ``.webidl`` files, the build system
   should only perform a minimal rebuild when sources change.

Explicit method for performing codegen
   There needs to be an explicit method for triggering code generation.
   It needs to cover regular and test files.

   This is implemented via ``make export`` in ``dom/bindings``.

No-op binding generation should be fast
   So that developers touching ``.webidl`` files are not inconvenienced,
   no-op binding generation should be fast. Watch out for the build system
   processing large dependency files it doesn't need in order to perform
   code generation.

Ability to generate example files
   *Any* interface can have example ``.h``/``.cpp`` files generated.
   There must be a mechanism to facilitate this.

   This is currently facilitated through ``mach webidl-example``.
@ -13,8 +13,6 @@ mock.pth:python/mock-1.0.0
mozilla.pth:build
mozilla.pth:config
mozilla.pth:xpcom/typelib/xpt/tools
mozilla.pth:dom/bindings
mozilla.pth:dom/bindings/parser
copy:build/buildconfig.py
packages.txt:testing/mozbase/packages.txt
objdir:build
@ -0,0 +1,98 @@
# This Source Code Form is subject to the terms of the Mozilla Public
# License, v. 2.0. If a copy of the MPL was not distributed with this file,
# You can obtain one at http://mozilla.org/MPL/2.0/.

import os
import cPickle
from Configuration import Configuration
from Codegen import CGBindingRoot, replaceFileIfChanged, CGEventRoot
from mozbuild.makeutil import Makefile
from mozbuild.pythonutil import iter_modules_in_path
from buildconfig import topsrcdir


def generate_binding_files(config, outputprefix, srcprefix, webidlfile,
                           generatedEventsWebIDLFiles):
    """
    |config| is the configuration object.
    |outputprefix| is a prefix to use for the header guards and filename.
    """

    depsname = ".deps/" + outputprefix + ".pp"
    root = CGBindingRoot(config, outputprefix, webidlfile)
    replaceFileIfChanged(outputprefix + ".h", root.declare())
    replaceFileIfChanged(outputprefix + ".cpp", root.define())

    if webidlfile in generatedEventsWebIDLFiles:
        eventName = webidlfile[:-len(".webidl")]
        generatedEvent = CGEventRoot(config, eventName)
        replaceFileIfChanged(eventName + ".h", generatedEvent.declare())
        replaceFileIfChanged(eventName + ".cpp", generatedEvent.define())

    mk = Makefile()
    # NOTE: it's VERY important that we output dependencies for the FooBinding
    # file here, not for the header or generated cpp file. These dependencies
    # are used later to properly determine changedDeps and prevent rebuilding
    # too much. See the comment explaining $(binding_dependency_trackers) in
    # Makefile.in.
    rule = mk.create_rule([outputprefix])
    rule.add_dependencies(os.path.join(srcprefix, x) for x in sorted(root.deps()))
    rule.add_dependencies(iter_modules_in_path(topsrcdir))
    with open(depsname, 'w') as f:
        mk.dump(f)


def main():
    # Parse arguments.
    from optparse import OptionParser
    usagestring = "usage: %prog [header|cpp] configFile outputPrefix srcPrefix webIDLFile"
    o = OptionParser(usage=usagestring)
    o.add_option("--verbose-errors", action='store_true', default=False,
                 help="When an error happens, display the Python traceback.")
    (options, args) = o.parse_args()

    configFile = os.path.normpath(args[0])
    srcPrefix = os.path.normpath(args[1])

    # Load the configuration.
    f = open('ParserResults.pkl', 'rb')
    config = cPickle.load(f)
    f.close()

    def readFile(f):
        file = open(f, 'rb')
        try:
            contents = file.read()
        finally:
            file.close()
        return contents
    allWebIDLFiles = readFile(args[2]).split()
    generatedEventsWebIDLFiles = readFile(args[3]).split()
    changedDeps = readFile(args[4]).split()

    if all(f.endswith("Binding") or f == "ParserResults.pkl" for f in changedDeps):
        toRegenerate = filter(lambda f: f.endswith("Binding"), changedDeps)
        if len(toRegenerate) == 0 and len(changedDeps) == 1:
            # Work around build system bug 874923: if we get here that means
            # that changedDeps contained only one entry and it was
            # "ParserResults.pkl". That should never happen: if the
            # ParserResults.pkl changes then either one of the globalgen files
            # changed (in which case we wouldn't be in this "only
            # ParserResults.pkl and *Binding changed" code) or some .webidl
            # files changed (and then the corresponding *Binding files should
            # show up in changedDeps). Since clearly the build system is
            # confused, just regenerate everything to be safe.
            toRegenerate = allWebIDLFiles
        else:
            toRegenerate = map(lambda f: f[:-len("Binding")] + ".webidl",
                               toRegenerate)
    else:
        toRegenerate = allWebIDLFiles

    for webIDLFile in toRegenerate:
        assert webIDLFile.endswith(".webidl")
        outputPrefix = webIDLFile[:-len(".webidl")] + "Binding"
        generate_binding_files(config, outputPrefix, srcPrefix, webIDLFile,
                               generatedEventsWebIDLFiles)

if __name__ == '__main__':
    main()
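The regeneration decision in ``main()`` above can be isolated into a small pure function. This is an illustration only; the function name is ours and the real logic is inline in ``BindingGen.py``:

```python
def files_to_regenerate(all_webidl_files, changed_deps):
    """Decide which .webidl files need their bindings regenerated,
    mirroring the decision in BindingGen.py's main(). Illustration only;
    this helper does not exist in the tree."""
    if all(f.endswith("Binding") or f == "ParserResults.pkl"
           for f in changed_deps):
        trackers = [f for f in changed_deps if f.endswith("Binding")]
        if not trackers and len(changed_deps) == 1:
            # Only ParserResults.pkl changed: the build system is
            # confused (bug 874923), so regenerate everything.
            return list(all_webidl_files)
        # Each changed FooBinding tracker maps back to Foo.webidl.
        return [f[:-len("Binding")] + ".webidl" for f in trackers]
    # A global dependency (e.g. a codegen .py file) changed.
    return list(all_webidl_files)
```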
@ -26,6 +26,26 @@ NEWRESOLVE_HOOK_NAME = '_newResolve'
ENUMERATE_HOOK_NAME = '_enumerate'
ENUM_ENTRY_VARIABLE_NAME = 'strings'

def replaceFileIfChanged(filename, newContents):
    """
    Read a copy of the old file, so that we don't touch it if it hasn't changed.
    Returns True if the file was updated, False otherwise.
    """
    oldFileContents = ""
    try:
        oldFile = open(filename, 'rb')
        oldFileContents = ''.join(oldFile.readlines())
        oldFile.close()
    except:
        pass

    if newContents == oldFileContents:
        return False

    f = open(filename, 'wb')
    f.write(newContents)
    f.close()
    return True
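The write-avoidance behavior can be demonstrated with a Python 3 re-statement of the same idea. The snake_case name is ours and the snippet is self-contained; the real function is the ``replaceFileIfChanged`` above in ``Codegen.py``:

```python
import os
import tempfile

def replace_file_if_changed(filename, new_contents):
    # Python 3 re-statement of replaceFileIfChanged: read the old
    # contents and skip the write when nothing changed, so downstream
    # tools see an untouched mtime.
    old = ""
    try:
        with open(filename, "r") as f:
            old = f.read()
    except OSError:
        pass
    if new_contents == old:
        return False
    with open(filename, "w") as f:
        f.write(new_contents)
    return True

tmpdir = tempfile.mkdtemp()
path = os.path.join(tmpdir, "Example.h")
first = replace_file_if_changed(path, "// generated\n")    # creates the file
second = replace_file_if_changed(path, "// generated\n")   # identical, no write
third = replace_file_if_changed(path, "// regenerated\n")  # contents changed
```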

def toStringBool(arg):
    return str(not not arg).lower()
@ -0,0 +1,46 @@
# This Source Code Form is subject to the terms of the Mozilla Public
# License, v. 2.0. If a copy of the MPL was not distributed with this file,
# You can obtain one at http://mozilla.org/MPL/2.0/.

import os
import cPickle
from Configuration import Configuration
from Codegen import CGExampleRoot, replaceFileIfChanged

def generate_interface_example(config, interfaceName):
    """
    |config| is the configuration object.
    |interfaceName| is the name of the interface we're generating an example for.
    """

    root = CGExampleRoot(config, interfaceName)
    exampleHeader = interfaceName + "-example.h"
    exampleImpl = interfaceName + "-example.cpp"
    replaceFileIfChanged(exampleHeader, root.declare())
    replaceFileIfChanged(exampleImpl, root.define())

def main():
    # Parse arguments.
    from optparse import OptionParser
    usagestring = "usage: %prog configFile interfaceName"
    o = OptionParser(usage=usagestring)
    o.add_option("--verbose-errors", action='store_true', default=False,
                 help="When an error happens, display the Python traceback.")
    (options, args) = o.parse_args()

    if len(args) != 2:
        o.error(usagestring)
    configFile = os.path.normpath(args[0])
    interfaceName = args[1]

    # Load the configuration.
    f = open('ParserResults.pkl', 'rb')
    config = cPickle.load(f)
    f.close()

    # Generate the example class.
    generate_interface_example(config, interfaceName)

if __name__ == '__main__':
    main()
@ -0,0 +1,81 @@
# This Source Code Form is subject to the terms of the Mozilla Public
# License, v. 2.0. If a copy of the MPL was not distributed with this file,
# You can obtain one at http://mozilla.org/MPL/2.0/.

# We do one global pass over all the WebIDL to generate our prototype enum
# and generate information for subsequent phases.

import os
import WebIDL
import cPickle
from Configuration import Configuration
from Codegen import GlobalGenRoots, replaceFileIfChanged

def generate_file(config, name, action):

    root = getattr(GlobalGenRoots, name)(config)
    # Compare strings with ==, not "is": identity comparison of string
    # literals only works by accident of interning.
    if action == 'declare':
        filename = name + '.h'
        code = root.declare()
    else:
        assert action == 'define'
        filename = name + '.cpp'
        code = root.define()

    if replaceFileIfChanged(filename, code):
        print "Generating %s" % (filename)
    else:
        print "%s hasn't changed - not touching it" % (filename)

def main():
    # Parse arguments.
    from optparse import OptionParser
    usageString = "usage: %prog [options] webidldir [files]"
    o = OptionParser(usage=usageString)
    o.add_option("--cachedir", dest='cachedir', default=None,
                 help="Directory in which to cache lex/parse tables.")
    o.add_option("--verbose-errors", action='store_true', default=False,
                 help="When an error happens, display the Python traceback.")
    (options, args) = o.parse_args()

    if len(args) < 2:
        o.error(usageString)

    configFile = args[0]
    baseDir = args[1]
    fileList = args[2:]

    # Parse the WebIDL.
    parser = WebIDL.Parser(options.cachedir)
    for filename in fileList:
        fullPath = os.path.normpath(os.path.join(baseDir, filename))
        f = open(fullPath, 'rb')
        lines = f.readlines()
        f.close()
        parser.parse(''.join(lines), fullPath)
    parserResults = parser.finish()

    # Load the configuration.
    config = Configuration(configFile, parserResults)

    # Write the configuration out to a pickle.
    resultsFile = open('ParserResults.pkl', 'wb')
    cPickle.dump(config, resultsFile, -1)
    resultsFile.close()

    # Generate the atom list.
    generate_file(config, 'GeneratedAtomList', 'declare')

    # Generate the prototype list.
    generate_file(config, 'PrototypeList', 'declare')

    # Generate the common code.
    generate_file(config, 'RegisterBindings', 'declare')
    generate_file(config, 'RegisterBindings', 'define')

    generate_file(config, 'UnionTypes', 'declare')
    generate_file(config, 'UnionTypes', 'define')
    generate_file(config, 'UnionConversions', 'declare')

if __name__ == '__main__':
    main()
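``GlobalGen.py`` (writer) and ``BindingGen.py`` (reader) hand the configuration off through ``ParserResults.pkl``; the handoff is plain pickling, sketched here with a stand-in dict instead of the real ``Configuration`` object:

```python
import os
import pickle
import tempfile

# Sketch of the ParserResults.pkl handoff. The payload here is a
# stand-in dict; in the real scripts it is a Configuration instance
# (and the scripts use the Python 2 cPickle module).
config = {"interfaces": ["Node", "Window"]}

tmpdir = tempfile.mkdtemp()
path = os.path.join(tmpdir, "ParserResults.pkl")
with open(path, "wb") as f:
    pickle.dump(config, f, -1)   # -1: highest protocol, as in GlobalGen.py

with open(path, "rb") as f:
    loaded = pickle.load(f)
```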
@ -1,78 +1,249 @@
# This Source Code Form is subject to the terms of the Mozilla Public
# License, v. 2.0. If a copy of the MPL was not distributed with this
# file, You can obtain one at http://mozilla.org/MPL/2.0/.

abs_dist := $(abspath $(DIST))
webidl_base := $(topsrcdir)/dom/webidl
# License, v. 2.0. If a copy of the MPL was not distributed with this file,
# You can obtain one at http://mozilla.org/MPL/2.0/.

webidl_base = $(topsrcdir)/dom/webidl
# Generated by moz.build
include webidlsrcs.mk

binding_include_path := mozilla/dom
webidl_files += $(generated_events_webidl_files)
all_webidl_files = $(webidl_files) $(generated_webidl_files) $(preprocessed_webidl_files)

# Set exported_binding_headers before adding the test IDL to the mix
exported_binding_headers := $(subst .webidl,Binding.h,$(all_webidl_files))
exported_generated_events_headers := $(subst .webidl,.h,$(generated_events_webidl_files))

# Set linked_binding_cpp_files before adding the test IDL to the mix
linked_binding_cpp_files := $(subst .webidl,Binding.cpp,$(all_webidl_files))
linked_generated_events_cpp_files := $(subst .webidl,.cpp,$(generated_events_webidl_files))

all_webidl_files += $(test_webidl_files) $(preprocessed_test_webidl_files)

generated_header_files := $(subst .webidl,Binding.h,$(all_webidl_files)) $(exported_generated_events_headers)
generated_cpp_files := $(subst .webidl,Binding.cpp,$(all_webidl_files)) $(linked_generated_events_cpp_files)

# We want to be able to only regenerate the .cpp and .h files that really need
# to change when a .webidl file changes. We do this by making the
# binding_dependency_trackers targets have dependencies on the right .webidl
# files via generated .pp files, having a .BindingGen target that depends on the
# binding_dependency_trackers and which has all the generated binding .h/.cpp
# depending on it, and then in the make commands for that target being able to
# check which exact binding_dependency_trackers changed.
binding_dependency_trackers := $(subst .webidl,Binding,$(all_webidl_files))

globalgen_targets := \
  GeneratedAtomList.h \
  PrototypeList.h \
  RegisterBindings.h \
  RegisterBindings.cpp \
  UnionTypes.h \
  UnionTypes.cpp \
  UnionConversions.h \
  $(NULL)

# Nasty hack: when the test/Makefile.in invokes us to do codegen, it
# uses a target of
# "export TestExampleInterface-example TestExampleProxyInterface-example".
# We don't actually need to load our .o.pp files in that case, so just
# pretend like we have no CPPSRCS if that's the target. It makes the
# test cycle much faster, which is why we're doing it.
#
# XXXbz We could try to cheat even more and only include our CPPSRCS
# when $(MAKECMDGOALS) contains libs, so that we can skip loading all
# those .o.pp when trying to make a single .cpp file too, but that
# would break |make FooBinding.o(bj)|. Ah, well.
ifneq (export TestExampleInterface-example TestExampleProxyInterface-example,$(MAKECMDGOALS))
CPPSRCS = \
  $(unified_binding_cpp_files) \
  $(linked_generated_events_cpp_files) \
  $(filter %.cpp, $(globalgen_targets)) \
  BindingUtils.cpp \
  CallbackInterface.cpp \
  CallbackObject.cpp \
  DOMJSProxyHandler.cpp \
  Date.cpp \
  Exceptions.cpp \
  $(NULL)
endif

ABS_DIST := $(abspath $(DIST))

EXTRA_EXPORT_MDDEPEND_FILES := $(addsuffix .pp,$(binding_dependency_trackers))

EXPORTS_GENERATED_FILES := $(exported_binding_headers) $(exported_generated_events_headers)
EXPORTS_GENERATED_DEST := $(ABS_DIST)/include/$(binding_include_path)
EXPORTS_GENERATED_TARGET := export
INSTALL_TARGETS += EXPORTS_GENERATED

# Install auto-generated GlobalGen files. The rules for the install must
# be in the same target/subtier as GlobalGen.py, otherwise the files will not
# get installed into the appropriate location as they are generated.
globalgen_headers_FILES := \
  GeneratedAtomList.h \
  PrototypeList.h \
  RegisterBindings.h \
  UnionConversions.h \
  UnionTypes.h \
  $(NULL)
globalgen_headers_DEST = $(ABS_DIST)/include/mozilla/dom
globalgen_headers_TARGET := export
INSTALL_TARGETS += globalgen_headers

include $(topsrcdir)/config/rules.mk

ifdef GNU_CC
CXXFLAGS += -Wno-uninitialized
endif

# These come from webidlsrcs.mk.
CPPSRCS += $(globalgen_sources) $(unified_binding_cpp_files)

# Generated bindings reference *Binding.h, not mozilla/dom/*Binding.h. And,
# since we generate exported bindings directly to $(DIST)/include, we need
# to add that path to the search list. We need to ensure the $(DIST) path
# occurs before '.' because old builds generated .h files into '.'
# before copying them to $(DIST). Those old .h files won't get updated
# any more and thus using them could result in build failures due to
# mismatches. This consideration shouldn't be relevant after CLOBBER
# is touched.
#
# Ideally, binding generation uses the prefixed header file names.
# Bug 932092 tracks.
LOCAL_INCLUDES += -I$(DIST)/include/mozilla/dom

PYTHON_UNIT_TESTS += $(srcdir)/mozwebidl/test/test_mozwebidl.py

include $(topsrcdir)/config/rules.mk


css2properties_dependencies = \
  $(topsrcdir)/layout/style/nsCSSPropList.h \
  $(topsrcdir)/layout/style/nsCSSPropAliasList.h \
  $(webidl_base)/CSS2Properties.webidl.in \
  $(webidl_base)/CSS2PropertiesProps.h \
  $(srcdir)/GenerateCSS2PropertiesWebIDL.py \
# If you change bindinggen_dependencies here, change it in
# dom/bindings/test/Makefile.in too.
bindinggen_dependencies := \
  BindingGen.py \
  Bindings.conf \
  Configuration.py \
  Codegen.py \
  ParserResults.pkl \
  parser/WebIDL.py \
  $(GLOBAL_DEPS) \
  $(NULL)

CSS2Properties.webidl: $(css2properties_dependencies)
	$(CPP) $(DEFINES) $(ACDEFINES) -I$(topsrcdir)/layout/style \
	  $(webidl_base)/CSS2PropertiesProps.h | \
	  PYTHONDONTWRITEBYTECODE=1 $(PYTHON) \
	  $(srcdir)/GenerateCSS2PropertiesWebIDL.py \
	  $(webidl_base)/CSS2Properties.webidl.in > $@
CSS2Properties.webidl: $(topsrcdir)/layout/style/nsCSSPropList.h \
  $(topsrcdir)/layout/style/nsCSSPropAliasList.h \
  $(webidl_base)/CSS2Properties.webidl.in \
  $(webidl_base)/CSS2PropertiesProps.h \
  $(srcdir)/GenerateCSS2PropertiesWebIDL.py \
  $(GLOBAL_DEPS)
	$(CPP) $(DEFINES) $(ACDEFINES) -I$(topsrcdir)/layout/style $(webidl_base)/CSS2PropertiesProps.h | \
	  PYTHONDONTWRITEBYTECODE=1 $(PYTHON) \
	  $(srcdir)/GenerateCSS2PropertiesWebIDL.py $(webidl_base)/CSS2Properties.webidl.in > CSS2Properties.webidl

# Most of the logic for dependencies lives inside Python so it can be
# used by multiple build backends. We simply have rules to generate
# and include the .pp file. This will pull in additional dependencies
# on codegen.pp which will cause any .webidl or .py file change to
# result in regeneration.
codegen_dependencies := \
  $(nonstatic_webidl_files) \
  $(GLOBAL_DEPS) \
  $(NULL)
$(webidl_files): %: $(webidl_base)/%
	$(INSTALL) $(IFLAGS1) $(webidl_base)/$* .

include codegen.pp
$(test_webidl_files): %: $(srcdir)/test/%
	$(INSTALL) $(IFLAGS1) $(srcdir)/test/$* .

codegen.pp: $(codegen_dependencies)
	$(call py_action,webidl,$(srcdir))
# We can't easily use PP_TARGETS here because it insists on outputting targets
# that look like "$(CURDIR)/foo" whereas we want our target to just be "foo".
# Make sure to include $(GLOBAL_DEPS) so we pick up changes to what symbols are
# defined. Also make sure to remove $@ before writing to it, because otherwise
# if a file goes from non-preprocessed to preprocessed we can end up writing to
# a symlink, which will clobber files in the srcdir, which is bad.
$(preprocessed_webidl_files): %: $(webidl_base)/% $(GLOBAL_DEPS)
	$(RM) $@
	$(call py_action,preprocessor, \
	  $(DEFINES) $(ACDEFINES) $(XULPPFLAGS) $(webidl_base)/$* -o $@)

# See the comment about PP_TARGETS for $(preprocessed_webidl_files)
$(preprocessed_test_webidl_files): %: $(srcdir)/test/% $(GLOBAL_DEPS)
	$(RM) $@
	$(call py_action,preprocessor, \
	  $(DEFINES) $(ACDEFINES) $(XULPPFLAGS) $(srcdir)/test/$* -o $@)

# Make is dumb and can get confused between "foo" and "$(CURDIR)/foo". Make
# sure that the latter depends on the former, since the latter gets used in .pp
# files.
all_webidl_files_absolute = $(addprefix $(CURDIR)/,$(all_webidl_files))
$(all_webidl_files_absolute): $(CURDIR)/%: %

$(generated_header_files): .BindingGen

$(generated_cpp_files): .BindingGen

# $(binding_dependency_trackers) pick up additional dependencies via .pp files.
# The rule just brings the tracker up to date, if it's out of date, so that
# we'll know that we have to redo binding generation and flag this prerequisite
# there as being newer than the bindinggen target.
$(binding_dependency_trackers):
	@$(TOUCH) $@

export:: codegen.pp
$(globalgen_targets): ParserResults.pkl

%-example: .BindingGen
	PYTHONDONTWRITEBYTECODE=1 $(PYTHON) $(topsrcdir)/config/pythonpath.py \
	  $(PLY_INCLUDE) -I$(srcdir)/parser \
	  $(srcdir)/ExampleGen.py \
	  $(srcdir)/Bindings.conf $*

CACHE_DIR = _cache

globalgen_dependencies := \
  GlobalGen.py \
  Bindings.conf \
  Configuration.py \
  Codegen.py \
  parser/WebIDL.py \
  webidlsrcs.mk \
  $(all_webidl_files) \
  $(CACHE_DIR)/.done \
  $(GLOBAL_DEPS) \
  $(NULL)

$(CACHE_DIR)/.done:
	$(MKDIR) -p $(CACHE_DIR)
	@$(TOUCH) $@

# Running GlobalGen.py updates ParserResults.pkl as a side-effect.
ParserResults.pkl: $(globalgen_dependencies)
	$(info Generating global WebIDL files)
	PYTHONDONTWRITEBYTECODE=1 $(PYTHON) $(topsrcdir)/config/pythonpath.py \
	  $(PLY_INCLUDE) -I$(srcdir)/parser \
	  $(srcdir)/GlobalGen.py $(srcdir)/Bindings.conf . \
	  --cachedir=$(CACHE_DIR) \
	  $(all_webidl_files)

$(globalgen_headers_FILES): ParserResults.pkl

# Make sure .deps actually exists, since we'll try to write to it from
# BindingGen.py but we're typically running in the export phase, which is
# before anyone has bothered creating .deps.
# Then, pass our long lists through files to try to avoid blowing out the
# command line.
# Next, BindingGen.py will examine the changed dependency list to figure out
# what it really needs to regenerate.
# Finally, touch the .BindingGen file so that we don't have to keep redoing
# all that until something else actually changes.
.BindingGen: $(bindinggen_dependencies) $(binding_dependency_trackers)
	$(info Generating WebIDL bindings)
	$(MKDIR) -p .deps
	echo $(all_webidl_files) > .all-webidl-file-list
	echo $(generated_events_webidl_files) > .generated-events-webidl-files
	echo $? > .changed-dependency-list
	PYTHONDONTWRITEBYTECODE=1 $(PYTHON) $(topsrcdir)/config/pythonpath.py \
	  $(PLY_INCLUDE) -I$(srcdir)/parser \
	  $(srcdir)/BindingGen.py \
	  $(srcdir)/Bindings.conf \
	  $(CURDIR) \
	  .all-webidl-file-list \
	  .generated-events-webidl-files \
	  .changed-dependency-list
	@$(TOUCH) $@

GARBAGE += \
  codegen.pp \
  codegen.json \
  webidlyacc.py \
  parser.out \
  WebIDLGrammar.pkl \
  $(wildcard *.h) \
  $(wildcard *.cpp) \
  $(wildcard *.webidl) \
  $(wildcard *-example.h) \
  $(wildcard *-example.cpp) \
  .BindingGen \
  .all-webidl-file-list \
  .generated-events-webidl-files \
  .changed-dependency-list \
  $(binding_dependency_trackers) \
  $(NULL)

# Make sure all binding header files are created during the export stage, so we
# don't have issues with .cpp files being compiled before we've generated the
# headers they depend on. This is really only needed for the test files, since
# the non-test headers are all exported above anyway. Note that this means that
# we do all of our codegen during export.
export:: $(generated_header_files)

distclean::
	-$(RM) \
	  $(generated_header_files) \
	  $(generated_cpp_files) \
	  $(all_webidl_files) \
	  $(globalgen_targets) \
	  ParserResults.pkl \
	  $(NULL)
@ -18,17 +18,6 @@ from mozbuild.base import MachCommandBase

@CommandProvider
class WebIDLProvider(MachCommandBase):
    @Command('webidl-example', category='misc',
             description='Generate example files for a WebIDL interface.')
    @CommandArgument('interface', nargs='+',
                     help='Interface(s) whose examples to generate.')
    def webidl_example(self, interface):
        from mozwebidl import BuildSystemWebIDL

        manager = self._spawn(BuildSystemWebIDL).manager
        for i in interface:
            manager.generate_example_files(i)

    @Command('webidl-parser-test', category='testing',
             description='Run WebIDL tests.')
    @CommandArgument('--verbose', '-v', action='store_true',
@ -4,8 +4,6 @@
# License, v. 2.0. If a copy of the MPL was not distributed with this
# file, You can obtain one at http://mozilla.org/MPL/2.0/.

TEST_DIRS += ['test']

EXPORTS.mozilla += [
    'ErrorResult.h',
]

@ -70,15 +68,6 @@ LOCAL_INCLUDES += [
    '/media/webrtc/signaling/src/peerconnection',
]

SOURCES += [
    'BindingUtils.cpp',
    'CallbackInterface.cpp',
    'CallbackObject.cpp',
    'Date.cpp',
    'DOMJSProxyHandler.cpp',
    'Exceptions.cpp',
]

include('/ipc/chromium/chromium-config.mozbuild')

if CONFIG['MOZ_AUDIO_CHANNEL_MANAGER']:
@ -1,563 +0,0 @@
# This Source Code Form is subject to the terms of the Mozilla Public
# License, v. 2.0. If a copy of the MPL was not distributed with this
# file, You can obtain one at http://mozilla.org/MPL/2.0/.

# This module contains code for managing WebIDL files and bindings for
# the build system.

from __future__ import unicode_literals

import errno
import hashlib
import json
import logging
import os

from copy import deepcopy

from mach.mixin.logging import LoggingMixin

from mozbuild.base import MozbuildObject
from mozbuild.makeutil import Makefile
from mozbuild.pythonutil import iter_modules_in_path
from mozbuild.util import FileAvoidWrite

import WebIDL
from Codegen import (
    CGBindingRoot,
    CGEventRoot,
    CGExampleRoot,
    GlobalGenRoots,
)
from Configuration import Configuration


class BuildResult(object):
    """Represents the result of building WebIDL files.

    This holds a summary of output file generation during a build.
    """

    def __init__(self):
        # The .webidl files that had their outputs regenerated.
        self.inputs = set()

        # The output files that were created.
        self.created = set()

        # The output files that changed.
        self.updated = set()

        # The output files that didn't change.
        self.unchanged = set()
|
||||
|
||||
|
||||
class WebIDLCodegenManagerState(dict):
    """Holds state for the WebIDL code generation manager.

    State is currently just an extended dict. The internal implementation of
    state should be considered a black box to everyone except
    WebIDLCodegenManager. But we'll still document it.

    Fields:

    version
        The integer version of the format. This is to detect incompatible
        changes between state. It should be bumped whenever the format
        changes or semantics change.

    webidls
        A dictionary holding information about every known WebIDL input.
        Keys are the basenames of input WebIDL files. Values are dicts of
        metadata. Keys in those dicts are:

        * filename - The full path to the input filename.
        * inputs - A set of full paths to other webidl files this webidl
          depends on.
        * outputs - Set of full output paths that are created/derived from
          this file.
        * sha1 - The hexadecimal SHA-1 of the input filename from the last
          processing time.

    global_depends
        A dictionary defining files that influence all processing. Keys
        are full filenames. Values are hexadecimal SHA-1 from the last
        processing time.
    """

    VERSION = 1

    def __init__(self, fh=None):
        self['version'] = self.VERSION
        self['webidls'] = {}
        self['global_depends'] = {}

        if not fh:
            return

        state = json.load(fh)
        if state['version'] != self.VERSION:
            raise Exception('Unknown state version: %s' % state['version'])

        self['version'] = state['version']
        self['global_depends'] = state['global_depends']

        for k, v in state['webidls'].items():
            self['webidls'][k] = v

            # Sets are converted to lists for serialization because JSON
            # doesn't support sets.
            self['webidls'][k]['inputs'] = set(v['inputs'])
            self['webidls'][k]['outputs'] = set(v['outputs'])

    def dump(self, fh):
        """Dump serialized state to a file handle."""
        normalized = deepcopy(self)

        for k, v in self['webidls'].items():
            # Convert sets to lists because JSON doesn't support sets.
            normalized['webidls'][k]['outputs'] = sorted(v['outputs'])
            normalized['webidls'][k]['inputs'] = sorted(v['inputs'])

        json.dump(normalized, fh, sort_keys=True)

class WebIDLCodegenManager(LoggingMixin):
    """Manages all things WebIDL.

    This object is meant to be generic and reusable. Paths, etc should be
    parameters and not hardcoded.
    """

    # Global parser derived declaration files.
    GLOBAL_DECLARE_FILES = {
        'GeneratedAtomList.h',
        'PrototypeList.h',
        'RegisterBindings.h',
        'UnionConversions.h',
        'UnionTypes.h',
    }

    # Global parser derived definition files.
    GLOBAL_DEFINE_FILES = {
        'RegisterBindings.cpp',
        'UnionTypes.cpp',
    }

    # Example interfaces to build along with the tree. Other example
    # interfaces will need to be generated manually.
    BUILD_EXAMPLE_INTERFACES = {
        'TestExampleInterface',
        'TestExampleProxyInterface',
    }

    def __init__(self, config_path, inputs, exported_header_dir,
                 codegen_dir, state_path, cache_dir=None, make_deps_path=None,
                 make_deps_target=None):
        """Create an instance that manages WebIDLs in the build system.

        config_path refers to a WebIDL config file (e.g. Bindings.conf).
        inputs is a 3-tuple describing the input .webidl files and how to
        process them. Members are:
            (set(.webidl files), set(basenames of exported files),
             set(basenames of generated events files))

        exported_header_dir and codegen_dir are directories where generated
        files will be written to.
        state_path is the path to a file that will receive JSON state from our
        actions.
        make_deps_path is the path to a make dependency file that we can
        optionally write.
        make_deps_target is the target that receives the make dependencies. It
        must be defined if using make_deps_path.
        """
        self.populate_logger()

        input_paths, exported_stems, generated_events_stems = inputs

        self._config_path = config_path
        self._input_paths = set(input_paths)
        self._exported_stems = set(exported_stems)
        self._generated_events_stems = set(generated_events_stems)
        self._exported_header_dir = exported_header_dir
        self._codegen_dir = codegen_dir
        self._state_path = state_path
        self._cache_dir = cache_dir
        self._make_deps_path = make_deps_path
        self._make_deps_target = make_deps_target

        if (make_deps_path and not make_deps_target) or (not make_deps_path and
                make_deps_target):
            raise Exception('Must define both make_deps_path and make_deps_target '
                'if one is defined.')

        self._parser_results = None
        self._config = None
        self._state = WebIDLCodegenManagerState()

        if os.path.exists(state_path):
            with open(state_path, 'rb') as fh:
                try:
                    self._state = WebIDLCodegenManagerState(fh=fh)
                except Exception as e:
                    self.log(logging.WARN, 'webidl_bad_state', {'msg': str(e)},
                        'Bad WebIDL state: {msg}')

    @property
    def config(self):
        if not self._config:
            self._parse_webidl()

        return self._config

    def generate_build_files(self):
        """Generate files required for the build.

        This function is in charge of generating all the .h/.cpp files derived
        from input .webidl files. Please note that there are build actions
        required to produce .webidl files and these build actions are
        explicitly not captured here: this function assumes all .webidl files
        are present and up to date.

        This routine is called as part of the build to ensure files that need
        to exist are present and up to date. This routine may not be called if
        the build dependencies (generated as a result of calling this the first
        time) say everything is up to date.

        Because reprocessing outputs for every .webidl on every invocation
        is expensive, we only regenerate the minimal set of files on every
        invocation. The rules for deciding what needs to be done are roughly
        as follows:

        1. If any .webidl changes, reparse all .webidl files and regenerate
           the global derived files. Only regenerate output files (.h/.cpp)
           impacted by the modified .webidl files.
        2. If a non-.webidl dependency (Python files, config file) changes,
           assume everything is out of date and regenerate the world. This
           is because changes in those could globally impact every output
           file.
        3. If an output file is missing, ensure it is present by performing
           necessary regeneration.
        """
        # Despite #1 above, we assume the build system is smart enough to not
        # invoke us if nothing has changed. Therefore, any invocation means
        # something has changed. And, if anything has changed, we need to
        # parse the WebIDL.
        self._parse_webidl()

        result = BuildResult()

        # If we parse, we always update globals - they are cheap and it is
        # easier that way.
        created, updated, unchanged = self._write_global_derived()
        result.created |= created
        result.updated |= updated
        result.unchanged |= unchanged

        # If any of the extra dependencies changed, regenerate the world.
        global_changed, global_hashes = self._global_dependencies_changed()
        if global_changed:
            # Make a copy because we may modify.
            changed_inputs = set(self._input_paths)
        else:
            changed_inputs = self._compute_changed_inputs()

        self._state['global_depends'] = global_hashes

        # Generate bindings from .webidl files.
        for filename in sorted(changed_inputs):
            basename = os.path.basename(filename)
            result.inputs.add(filename)
            written, deps = self._generate_build_files_for_webidl(filename)
            result.created |= written[0]
            result.updated |= written[1]
            result.unchanged |= written[2]

            self._state['webidls'][basename] = dict(
                filename=filename,
                outputs=written[0] | written[1] | written[2],
                inputs=set(deps),
                sha1=self._input_hashes[filename],
            )

        # Process some special interfaces required for testing.
        for interface in self.BUILD_EXAMPLE_INTERFACES:
            written = self.generate_example_files(interface)
            result.created |= written[0]
            result.updated |= written[1]
            result.unchanged |= written[2]

        # Generate a make dependency file.
        if self._make_deps_path:
            mk = Makefile()
            codegen_rule = mk.create_rule([self._make_deps_target])
            codegen_rule.add_dependencies(global_hashes.keys())
            codegen_rule.add_dependencies(self._input_paths)

            with FileAvoidWrite(self._make_deps_path) as fh:
                mk.dump(fh)

        self._save_state()

        return result

    def generate_example_files(self, interface):
        """Generates example files for a given interface."""
        root = CGExampleRoot(self.config, interface)

        return self._maybe_write_codegen(root, *self._example_paths(interface))

    def _parse_webidl(self):
        self.log(logging.INFO, 'webidl_parse',
            {'count': len(self._input_paths)},
            'Parsing {count} WebIDL files.')

        hashes = {}
        parser = WebIDL.Parser(self._cache_dir)

        for path in sorted(self._input_paths):
            with open(path, 'rb') as fh:
                data = fh.read()
                hashes[path] = hashlib.sha1(data).hexdigest()
                parser.parse(data, path)

        self._parser_results = parser.finish()
        self._config = Configuration(self._config_path, self._parser_results)
        self._input_hashes = hashes

    def _write_global_derived(self):
        things = [('declare', f) for f in self.GLOBAL_DECLARE_FILES]
        things.extend(('define', f) for f in self.GLOBAL_DEFINE_FILES)

        result = (set(), set(), set())

        for what, filename in things:
            stem = os.path.splitext(filename)[0]
            root = getattr(GlobalGenRoots, stem)(self._config)

            if what == 'declare':
                code = root.declare()
                output_root = self._exported_header_dir
            elif what == 'define':
                code = root.define()
                output_root = self._codegen_dir
            else:
                raise Exception('Unknown global gen type: %s' % what)

            output_path = os.path.join(output_root, filename)
            self._maybe_write_file(output_path, code, result)

        return result

    def _compute_changed_inputs(self):
        """Compute the set of input files that need to be regenerated."""
        changed_inputs = set()
        expected_outputs = self.expected_build_output_files()

        # Look for missing output files.
        if any(not os.path.exists(f) for f in expected_outputs):
            # FUTURE Bug 940469 Only regenerate minimum set.
            changed_inputs |= self._input_paths

        # That's it for examining output files. We /could/ examine SHA-1s of
        # output files from a previous run to detect modifications. But that's
        # a lot of extra work and most build systems don't do that anyway.

        # Now we move on to the input files.
        old_hashes = {v['filename']: v['sha1']
            for v in self._state['webidls'].values()}

        old_filenames = set(old_hashes.keys())
        new_filenames = self._input_paths

        # If an old file has disappeared or a new file has arrived, mark
        # it.
        changed_inputs |= old_filenames ^ new_filenames

        # For the files in common between runs, compare content. If the file
        # has changed, mark it. We don't need to perform mtime comparisons
        # because content is a stronger validator.
        for filename in old_filenames & new_filenames:
            if old_hashes[filename] != self._input_hashes[filename]:
                changed_inputs.add(filename)

        # We've now populated the base set of inputs that have changed.

        # Inherit dependencies from previous run. The full set of dependencies
        # is associated with each record, so we don't need to perform any fancy
        # graph traversal.
        for v in self._state['webidls'].values():
            if any(dep for dep in v['inputs'] if dep in changed_inputs):
                changed_inputs.add(v['filename'])

        # Ensure all changed inputs actually exist (some changed inputs could
        # have been from deleted files).
        return set(f for f in changed_inputs if os.path.exists(f))

    def _binding_info(self, p):
        """Compute binding metadata for an input path.

        Returns a tuple of:

            (stem, binding_stem, is_event, output_files)

        output_files is itself a tuple. The first two items are the binding
        header and C++ paths, respectively. The second pair is the event header
        and C++ paths, or None if this isn't an event binding.
        """
        basename = os.path.basename(p)
        stem = os.path.splitext(basename)[0]
        binding_stem = '%sBinding' % stem

        if stem in self._exported_stems:
            header_dir = self._exported_header_dir
        else:
            header_dir = self._codegen_dir

        is_event = stem in self._generated_events_stems

        files = (
            os.path.join(header_dir, '%s.h' % binding_stem),
            os.path.join(self._codegen_dir, '%s.cpp' % binding_stem),
            os.path.join(header_dir, '%s.h' % stem) if is_event else None,
            os.path.join(self._codegen_dir, '%s.cpp' % stem) if is_event else None,
        )

        return stem, binding_stem, is_event, header_dir, files

    def _example_paths(self, interface):
        return (
            os.path.join(self._codegen_dir, '%s-example.h' % interface),
            os.path.join(self._codegen_dir, '%s-example.cpp' % interface))

    def expected_build_output_files(self):
        """Obtain the set of files generate_build_files() should write."""
        paths = set()

        # Account for global generation.
        for p in self.GLOBAL_DECLARE_FILES:
            paths.add(os.path.join(self._exported_header_dir, p))
        for p in self.GLOBAL_DEFINE_FILES:
            paths.add(os.path.join(self._codegen_dir, p))

        for p in self._input_paths:
            stem, binding_stem, is_event, header_dir, files = self._binding_info(p)
            paths |= {f for f in files if f}

        for interface in self.BUILD_EXAMPLE_INTERFACES:
            for p in self._example_paths(interface):
                paths.add(p)

        return paths

    def _generate_build_files_for_webidl(self, filename):
        self.log(logging.INFO, 'webidl_generate_build_for_input',
            {'filename': filename},
            'Generating WebIDL files derived from {filename}')

        stem, binding_stem, is_event, header_dir, files = self._binding_info(filename)
        root = CGBindingRoot(self._config, binding_stem, filename)

        result = self._maybe_write_codegen(root, files[0], files[1])

        if is_event:
            generated_event = CGEventRoot(self._config, stem)
            result = self._maybe_write_codegen(generated_event, files[2],
                files[3], result)

        return result, root.deps()

    def _global_dependencies_changed(self):
        """Determine whether the global dependencies have changed."""
        current_files = set(iter_modules_in_path(os.path.dirname(__file__)))

        # We need to catch other .py files from /dom/bindings. We assume these
        # are in the same directory as the config file.
        current_files |= set(iter_modules_in_path(os.path.dirname(self._config_path)))

        current_files.add(self._config_path)

        current_hashes = {}
        for f in current_files:
            # This will fail if the file doesn't exist. If a current global
            # dependency doesn't exist, something else is wrong.
            with open(f, 'rb') as fh:
                current_hashes[f] = hashlib.sha1(fh.read()).hexdigest()

        # The set of files has changed.
        if current_files ^ set(self._state['global_depends'].keys()):
            return True, current_hashes

        # Compare hashes.
        for f, sha1 in current_hashes.items():
            if sha1 != self._state['global_depends'][f]:
                return True, current_hashes

        return False, current_hashes

    def _save_state(self):
        with open(self._state_path, 'wb') as fh:
            self._state.dump(fh)

    def _maybe_write_codegen(self, obj, declare_path, define_path, result=None):
        assert declare_path and define_path
        if not result:
            result = (set(), set(), set())

        self._maybe_write_file(declare_path, obj.declare(), result)
        self._maybe_write_file(define_path, obj.define(), result)

        return result

    def _maybe_write_file(self, path, content, result):
        fh = FileAvoidWrite(path)
        fh.write(content)
        existed, updated = fh.close()

        if not existed:
            result[0].add(path)
        elif updated:
            result[1].add(path)
        else:
            result[2].add(path)


def create_build_system_manager(topsrcdir, topobjdir, dist_dir):
    """Create a WebIDLManager for use by the build system."""
    src_dir = os.path.join(topsrcdir, 'dom', 'bindings')
    obj_dir = os.path.join(topobjdir, 'dom', 'bindings')

    with open(os.path.join(obj_dir, 'file-lists.json'), 'rb') as fh:
        files = json.load(fh)

    inputs = (files['webidls'], files['exported_stems'],
        files['generated_events_stems'])

    cache_dir = os.path.join(obj_dir, '_cache')
    try:
        os.makedirs(cache_dir)
    except OSError as e:
        if e.errno != errno.EEXIST:
            raise

    return WebIDLCodegenManager(
        os.path.join(src_dir, 'Bindings.conf'),
        inputs,
        os.path.join(dist_dir, 'include', 'mozilla', 'dom'),
        obj_dir,
        os.path.join(obj_dir, 'codegen.json'),
        cache_dir=cache_dir,
        # The make rules include a codegen.pp file containing dependencies.
        make_deps_path=os.path.join(obj_dir, 'codegen.pp'),
        make_deps_target='codegen.pp',
    )


class BuildSystemWebIDL(MozbuildObject):
    @property
    def manager(self):
        if not hasattr(self, '_webidl_manager'):
            self._webidl_manager = create_build_system_manager(
                self.topsrcdir, self.topobjdir, self.distdir)

        return self._webidl_manager
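One pattern worth noting in the removed module above is its state serialization: sets are converted to sorted lists on dump (JSON has no set type) and rehydrated into sets on load, with a version field guarding against incompatible formats. A standalone sketch of that round-trip; the function names here are illustrative stand-ins, not the removed API:

```python
import io
import json

STATE_VERSION = 1

def dump_state(state, fh):
    # JSON has no set type, so normalize sets to sorted lists first.
    normalized = {
        'version': state['version'],
        'webidls': {k: {'inputs': sorted(v['inputs']),
                        'outputs': sorted(v['outputs'])}
                    for k, v in state['webidls'].items()},
    }
    json.dump(normalized, fh, sort_keys=True)

def load_state(fh):
    raw = json.load(fh)
    if raw['version'] != STATE_VERSION:
        # The removed class resets to empty state on mismatch; this
        # sketch simply refuses to load.
        raise Exception('Unknown state version: %s' % raw['version'])
    # Rehydrate lists back into sets for cheap membership tests.
    return {'version': raw['version'],
            'webidls': {k: {'inputs': set(v['inputs']),
                            'outputs': set(v['outputs'])}
                        for k, v in raw['webidls'].items()}}

state = {'version': STATE_VERSION,
         'webidls': {'Child.webidl': {'inputs': {'Parent.webidl'},
                                      'outputs': {'ChildBinding.h',
                                                  'ChildBinding.cpp'}}}}
buf = io.StringIO()
dump_state(state, buf)
buf.seek(0)
assert load_state(buf) == state
```

Sorting the lists keeps the serialized form deterministic, which matters when the state file feeds content-comparison machinery like FileAvoidWrite.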
@@ -1,3 +0,0 @@
interface Child : Parent {
  void ChildBaz();
};

@@ -1,2 +0,0 @@
interface DummyInterface {};
interface DummyInterfaceWorkers {};

@@ -1,3 +0,0 @@
/* These interfaces are hard-coded and need to be defined. */
interface TestExampleInterface {};
interface TestExampleProxyInterface {};

@@ -1,3 +0,0 @@
interface Parent {
  void MethodFoo();
};

@@ -1,13 +0,0 @@
interface EventTarget {
  void addEventListener();
};

interface Event {};

callback EventHandlerNonNull = any (Event event);
typedef EventHandlerNonNull? EventHandler;

[NoInterfaceObject]
interface TestEvent : EventTarget {
  attribute EventHandler onfoo;
};
@@ -1,276 +0,0 @@
# This Source Code Form is subject to the terms of the Mozilla Public
# License, v. 2.0. If a copy of the MPL was not distributed with this
# file, You can obtain one at http://mozilla.org/MPL/2.0/.

from __future__ import unicode_literals

import imp
import json
import os
import shutil
import sys
import tempfile
import unittest

from mozwebidl import (
    WebIDLCodegenManager,
    WebIDLCodegenManagerState,
)

from mozfile import NamedTemporaryFile

from mozunit import (
    MockedOpen,
    main,
)


OUR_DIR = os.path.abspath(os.path.dirname(__file__))
TOPSRCDIR = os.path.normpath(os.path.join(OUR_DIR, '..', '..', '..', '..'))


class TestWebIDLCodegenManager(unittest.TestCase):
    TEST_STEMS = {
        'Child',
        'Parent',
        'ExampleBinding',
        'TestEvent',
    }

    @property
    def _static_input_paths(self):
        s = {os.path.join(OUR_DIR, p) for p in os.listdir(OUR_DIR)
             if p.endswith('.webidl')}

        return s

    @property
    def _config_path(self):
        config = os.path.join(TOPSRCDIR, 'dom', 'bindings', 'Bindings.conf')
        self.assertTrue(os.path.exists(config))

        return config

    def _get_manager_args(self):
        tmp = tempfile.mkdtemp()
        self.addCleanup(shutil.rmtree, tmp)

        cache_dir = os.path.join(tmp, 'cache')
        os.mkdir(cache_dir)

        ip = self._static_input_paths

        inputs = (
            ip,
            {os.path.splitext(os.path.basename(p))[0] for p in ip},
            set()
        )

        return dict(
            config_path=self._config_path,
            inputs=inputs,
            exported_header_dir=os.path.join(tmp, 'exports'),
            codegen_dir=os.path.join(tmp, 'codegen'),
            state_path=os.path.join(tmp, 'state.json'),
            make_deps_path=os.path.join(tmp, 'codegen.pp'),
            make_deps_target='codegen.pp',
            cache_dir=cache_dir,
        )

    def _get_manager(self):
        return WebIDLCodegenManager(**self._get_manager_args())

    def test_unknown_state_version(self):
        """Loading a state file with a too new version resets state."""
        args = self._get_manager_args()

        p = args['state_path']

        with open(p, 'wb') as fh:
            json.dump({
                'version': WebIDLCodegenManagerState.VERSION + 1,
                'foobar': '1',
            }, fh)

        manager = WebIDLCodegenManager(**args)

        self.assertEqual(manager._state['version'],
            WebIDLCodegenManagerState.VERSION)
        self.assertNotIn('foobar', manager._state)

    def test_generate_build_files(self):
        """generate_build_files() does the right thing from empty."""
        manager = self._get_manager()
        result = manager.generate_build_files()
        self.assertEqual(len(result.inputs), 5)

        output = manager.expected_build_output_files()
        self.assertEqual(result.created, output)
        self.assertEqual(len(result.updated), 0)
        self.assertEqual(len(result.unchanged), 0)

        for f in output:
            self.assertTrue(os.path.isfile(f))

        for f in manager.GLOBAL_DECLARE_FILES:
            self.assertIn(os.path.join(manager._exported_header_dir, f), output)

        for f in manager.GLOBAL_DEFINE_FILES:
            self.assertIn(os.path.join(manager._codegen_dir, f), output)

        for s in self.TEST_STEMS:
            self.assertTrue(os.path.isfile(os.path.join(
                manager._exported_header_dir, '%sBinding.h' % s)))
            self.assertTrue(os.path.isfile(os.path.join(
                manager._codegen_dir, '%sBinding.cpp' % s)))

        self.assertTrue(os.path.isfile(manager._state_path))

        with open(manager._state_path, 'rb') as fh:
            state = json.load(fh)
            self.assertEqual(state['version'], 1)
            self.assertIn('webidls', state)

            child = state['webidls']['Child.webidl']
            self.assertEqual(len(child['inputs']), 2)
            self.assertEqual(len(child['outputs']), 2)
            self.assertEqual(child['sha1'], 'c41527cad3bc161fa6e7909e48fa11f9eca0468b')

    def test_generate_build_files_load_state(self):
        """State should be equivalent when instantiating a new instance."""
        args = self._get_manager_args()
        m1 = WebIDLCodegenManager(**args)
        self.assertEqual(len(m1._state['webidls']), 0)
        m1.generate_build_files()

        m2 = WebIDLCodegenManager(**args)
        self.assertGreater(len(m2._state['webidls']), 2)
        self.assertEqual(m1._state, m2._state)

    def test_no_change_no_writes(self):
        """If nothing changes, no files should be updated."""
        args = self._get_manager_args()
        m1 = WebIDLCodegenManager(**args)
        m1.generate_build_files()

        m2 = WebIDLCodegenManager(**args)
        result = m2.generate_build_files()

        self.assertEqual(len(result.inputs), 0)
        self.assertEqual(len(result.created), 0)
        self.assertEqual(len(result.updated), 0)

    def test_output_file_regenerated(self):
        """If an output file disappears, it is regenerated."""
        args = self._get_manager_args()
        m1 = WebIDLCodegenManager(**args)
        m1.generate_build_files()

        rm_count = 0
        for p in m1._state['webidls']['Child.webidl']['outputs']:
            rm_count += 1
            os.unlink(p)

        for p in m1.GLOBAL_DECLARE_FILES:
            rm_count += 1
            os.unlink(os.path.join(m1._exported_header_dir, p))

        m2 = WebIDLCodegenManager(**args)
        result = m2.generate_build_files()
        self.assertEqual(len(result.created), rm_count)

    def test_only_rebuild_self(self):
        """If an input file changes, only rebuild that one file."""
        args = self._get_manager_args()
        m1 = WebIDLCodegenManager(**args)
        m1.generate_build_files()

        child_path = None
        for p in m1._input_paths:
            if p.endswith('Child.webidl'):
                child_path = p
                break

        self.assertIsNotNone(child_path)
        child_content = open(child_path, 'rb').read()

        with MockedOpen({child_path: child_content + '\n/* */'}):
            m2 = WebIDLCodegenManager(**args)
            result = m2.generate_build_files()
            self.assertEqual(result.inputs, set([child_path]))
            self.assertEqual(len(result.updated), 0)
            self.assertEqual(len(result.created), 0)

    def test_rebuild_dependencies(self):
        """Ensure an input file used by others results in others rebuilding."""
        args = self._get_manager_args()
        m1 = WebIDLCodegenManager(**args)
        m1.generate_build_files()

        parent_path = None
        child_path = None
        for p in m1._input_paths:
            if p.endswith('Parent.webidl'):
                parent_path = p
            elif p.endswith('Child.webidl'):
                child_path = p

        self.assertIsNotNone(parent_path)
        parent_content = open(parent_path, 'rb').read()

        with MockedOpen({parent_path: parent_content + '\n/* */'}):
            m2 = WebIDLCodegenManager(**args)
            result = m2.generate_build_files()
            self.assertEqual(result.inputs, {child_path, parent_path})
            self.assertEqual(len(result.updated), 0)
            self.assertEqual(len(result.created), 0)

    def test_python_change_regenerate_everything(self):
        """If a Python file changes, we should attempt to rebuild everything."""

        # We don't want to mutate files in the source directory because we want
        # to be able to build from a read-only filesystem. So, we install a
        # dummy module and rewrite the metadata to say it comes from the source
        # directory.
        #
        # Hacking imp to accept a MockedFile doesn't appear possible. So for
        # the first iteration we read from a temp file. The second iteration
        # doesn't need to import, so we are fine with a mocked file.
        fake_path = os.path.join(OUR_DIR, 'fakemodule.py')
        with NamedTemporaryFile('wt') as fh:
            fh.write('# Original content')
            fh.flush()
            mod = imp.load_source('mozwebidl.fakemodule', fh.name)
            mod.__file__ = fake_path

            args = self._get_manager_args()
            m1 = WebIDLCodegenManager(**args)
            with MockedOpen({fake_path: '# Original content'}):
                old_exists = os.path.exists
                try:
                    def exists(p):
                        if p == fake_path:
                            return True
                        return old_exists(p)

                    os.path.exists = exists

                    result = m1.generate_build_files()
                    l = len(result.inputs)

                    with open(fake_path, 'wt') as fh:
                        fh.write('# Modified content')

                    m2 = WebIDLCodegenManager(**args)
                    result = m2.generate_build_files()
                    self.assertEqual(len(result.inputs), l)

                    result = m2.generate_build_files()
                    self.assertEqual(len(result.inputs), 0)
                finally:
                    os.path.exists = old_exists
                    del sys.modules['mozwebidl.fakemodule']


if __name__ == '__main__':
    main()
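The rebuild tests above hinge on content-hash change detection rather than mtime comparisons: stored SHA-1s are checked against current file content, and files that appear or disappear always count as changed. A minimal, self-contained sketch of that scheme; the helper name and sample data are hypothetical:

```python
import hashlib

def changed_inputs(old_hashes, new_contents):
    """old_hashes: {filename: sha1 hex}; new_contents: {filename: bytes}."""
    old_names = set(old_hashes)
    new_names = set(new_contents)
    # Files that appeared or disappeared are always treated as changed.
    changed = old_names ^ new_names
    # For survivors, compare content hashes: a stronger validator than mtimes.
    for name in old_names & new_names:
        if hashlib.sha1(new_contents[name]).hexdigest() != old_hashes[name]:
            changed.add(name)
    # Only report inputs that still exist.
    return changed & new_names

old = {'Parent.webidl': hashlib.sha1(b'interface Parent {};').hexdigest(),
       'Removed.webidl': 'deadbeef'}
new = {'Parent.webidl': b'interface Parent { void MethodFoo(); };',
       'Child.webidl': b'interface Child : Parent {};'}
assert changed_inputs(old, new) == {'Parent.webidl', 'Child.webidl'}
```

The real manager additionally propagates changes through stored per-file dependency sets, which is why modifying Parent.webidl also marks Child.webidl dirty in the tests.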
@@ -2,18 +2,89 @@
# License, v. 2.0. If a copy of the MPL was not distributed with this file,
# You can obtain one at http://mozilla.org/MPL/2.0/.

# Do NOT export this library. We don't actually want our test code
# being added to libxul or anything.

# pymake can't handle descending into dom/bindings several times simultaneously
ifdef .PYMAKE
.NOTPARALLEL:
endif

# Need this for $(test_webidl_files)
include ../webidlsrcs.mk

# $(test_stems) comes from webidlsrcs.mk.
CPPSRCS += $(addprefix ../,$(test_stems))
# But the webidl actually lives in our parent dir
test_webidl_files := $(addprefix ../,$(test_webidl_files))
# Store the actual locations of our source preprocessed files, so we
# can depend on them sanely.
source_preprocessed_test_webidl_files := $(addprefix $(srcdir)/,$(preprocessed_test_webidl_files))
preprocessed_test_webidl_files := $(addprefix ../,$(preprocessed_test_webidl_files))

# Bug 932082 tracks having bindings use namespaced includes.
LOCAL_INCLUDES += -I$(DIST)/include/mozilla/dom -I..
CPPSRCS += \
  $(subst .webidl,Binding.cpp,$(test_webidl_files)) \
  $(subst .webidl,Binding.cpp,$(preprocessed_test_webidl_files)) \
  $(NULL)

# If you change bindinggen_dependencies here, change it in
# dom/bindings/Makefile.in too. But note that we include ../Makefile
# here manually, since $(GLOBAL_DEPS) won't cover it.
bindinggen_dependencies := \
  ../BindingGen.py \
  ../Bindings.conf \
  ../Configuration.py \
  ../Codegen.py \
  ../ParserResults.pkl \
  ../parser/WebIDL.py \
  ../Makefile \
  $(GLOBAL_DEPS) \
  $(NULL)

ifdef GNU_CC
CXXFLAGS += -Wno-uninitialized
endif

# Include rules.mk before any of our targets so our first target is coming from
# rules.mk and running make with no target in this dir does the right thing.
include $(topsrcdir)/config/rules.mk

$(CPPSRCS): .BindingGen

.BindingGen: $(bindinggen_dependencies) \
             $(test_webidl_files) \
             $(source_preprocessed_test_webidl_files) \
             $(NULL)
# The export phase in dom/bindings is what actually looks at
# dependencies and regenerates things as needed, so just go ahead and
# make that phase here. Also make our example interface files. If the
# target used here ever changes, change the conditional around
# $(CPPSRCS) in dom/bindings/Makefile.in.
	$(MAKE) -C .. export TestExampleInterface-example TestExampleProxyInterface-example
	@$(TOUCH) $@

check::
	PYTHONDONTWRITEBYTECODE=1 $(PYTHON) $(topsrcdir)/config/pythonpath.py \
	  $(PLY_INCLUDE) $(srcdir)/../parser/runtests.py

# Since we define MOCHITEST_FILES, config/makefiles/mochitest.mk goes ahead and
# sets up a rule with libs:: in it, which makes our .DEFAULT_GOAL be "libs".
# Then rules.mk does |.DEFAULT_GOAL ?= default| which leaves it as "libs". So
# if we make without an explicit target in this directory, we try to make
# "libs", but with a $(MAKECMDGOALS) of empty string. And then rules.mk
# helpfully does not include our *.o.pp files, since it includes them only if
# filtering some stuff out from $(MAKECMDGOALS) leaves it nonempty. The upshot
# is that if some headers change and we run make in this dir without an
# explicit target, things don't get rebuilt.
#
# On the other hand, if we set .DEFAULT_GOAL to "default" explicitly here,
# then rules.mk will reinvoke make with "export" and "libs", but this time they
# will be passed as explicit targets, show up in $(MAKECMDGOALS), and things
# will work. Do this at the end of our Makefile so the rest of the build system
# does not get a chance to muck with it after we set it.
.DEFAULT_GOAL := default

# Make sure to add .BindingGen to GARBAGE so we'll rebuild our example
# files if someone goes through and deletes GARBAGE all over, which
# will delete example files from our parent dir.
GARBAGE += \
  .BindingGen \
  $(NULL)
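The `$(subst .webidl,Binding.cpp,…)` lines in the Makefile above derive the generated C++ binding filenames from the WebIDL filenames. A minimal Python sketch of the same mapping (the function name is illustrative, not part of the build system):

```python
def binding_cpp_names(webidl_files):
    # Mimic $(subst .webidl,Binding.cpp,...): each Foo.webidl is compiled
    # into a generated FooBinding.cpp by the bindings codegen.
    return [f.replace('.webidl', 'Binding.cpp') for f in webidl_files]


print(binding_cpp_names(['TestDictionary.webidl', 'TestTypedef.webidl']))
```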
@@ -14,20 +14,9 @@ MOCHITEST_MANIFESTS += ['mochitest.ini']

MOCHITEST_CHROME_MANIFESTS += ['chrome.ini']

TEST_WEBIDL_FILES += [
    'TestDictionary.webidl',
    'TestJSImplInheritanceGen.webidl',
    'TestTypedef.webidl',
]

PREPROCESSED_TEST_WEBIDL_FILES += [
    'TestCodeGen.webidl',
    'TestExampleGen.webidl',
    'TestJSImplGen.webidl',
]

LOCAL_INCLUDES += [
    '/dom/bindings',
    '/js/xpconnect/src',
    '/js/xpconnect/wrappers',
]
@@ -100,9 +100,12 @@ if CONFIG['MOZ_GAMEPAD']:

if CONFIG['MOZ_NFC']:
    PARALLEL_DIRS += ['nfc']

# bindings/test is here, because it needs to build after bindings/, and
# we build subdirectories before ourselves.
TEST_DIRS += [
    'tests',
    'imptests',
    'bindings/test',
]

if CONFIG['MOZ_WIDGET_TOOLKIT'] in ('gtk2', 'cocoa', 'windows', 'android', 'qt', 'os2'):
@@ -545,6 +545,18 @@ if CONFIG['MOZ_B2G_FM']:
        'FMRadio.webidl',
    ]

if CONFIG['ENABLE_TESTS']:
    TEST_WEBIDL_FILES += [
        'TestDictionary.webidl',
        'TestJSImplInheritanceGen.webidl',
        'TestTypedef.webidl',
    ]
    PREPROCESSED_TEST_WEBIDL_FILES += [
        'TestCodeGen.webidl',
        'TestExampleGen.webidl',
        'TestJSImplGen.webidl',
    ]

GENERATED_EVENTS_WEBIDL_FILES = [
    'BlobEvent.webidl',
    'CallGroupErrorEvent.webidl',
@@ -1,17 +0,0 @@
# This Source Code Form is subject to the terms of the Mozilla Public
# License, v. 2.0. If a copy of the MPL was not distributed with this
# file, You can obtain one at http://mozilla.org/MPL/2.0/.

import sys

from mozwebidl import BuildSystemWebIDL


def main(argv):
    """Perform WebIDL code generation required by the build system."""
    manager = BuildSystemWebIDL.from_environment().manager
    manager.generate_build_files()


if __name__ == '__main__':
    sys.exit(main(sys.argv[1:]))
@@ -12,14 +12,8 @@ import mozpack.path as mozpath
from .base import BuildBackend

from ..frontend.data import (
    GeneratedEventWebIDLFile,
    GeneratedWebIDLFile,
    PreprocessedTestWebIDLFile,
    PreprocessedWebIDLFile,
    TestManifest,
    TestWebIDLFile,
    XPIDLFile,
    WebIDLFile,
)

from ..util import DefaultOnReadDict
@@ -57,80 +51,6 @@ class XPIDLManager(object):
        self.modules.setdefault(entry['module'], set()).add(entry['root'])


class WebIDLCollection(object):
    """Collects WebIDL info referenced during the build."""

    def __init__(self):
        self.sources = set()
        self.generated_sources = set()
        self.generated_events_sources = set()
        self.preprocessed_sources = set()
        self.test_sources = set()
        self.preprocessed_test_sources = set()

    def all_regular_sources(self):
        return self.sources | self.generated_sources | \
            self.generated_events_sources | self.preprocessed_sources

    def all_regular_basenames(self):
        return [os.path.basename(source) for source in self.all_regular_sources()]

    def all_regular_stems(self):
        return [os.path.splitext(b)[0] for b in self.all_regular_basenames()]

    def all_regular_bindinggen_stems(self):
        for stem in self.all_regular_stems():
            yield '%sBinding' % stem

        for source in self.generated_events_sources:
            yield os.path.splitext(os.path.basename(source))[0]

    def all_regular_cpp_basenames(self):
        for stem in self.all_regular_bindinggen_stems():
            yield '%s.cpp' % stem

    def all_test_sources(self):
        return self.test_sources | self.preprocessed_test_sources

    def all_test_basenames(self):
        return [os.path.basename(source) for source in self.all_test_sources()]

    def all_test_stems(self):
        return [os.path.splitext(b)[0] for b in self.all_test_basenames()]

    def all_test_cpp_basenames(self):
        return ['%sBinding.cpp' % s for s in self.all_test_stems()]

    def all_static_sources(self):
        return self.sources | self.generated_events_sources | \
            self.test_sources

    def all_non_static_sources(self):
        return self.generated_sources | self.preprocessed_sources | \
            self.preprocessed_test_sources

    def all_non_static_basenames(self):
        return [os.path.basename(s) for s in self.all_non_static_sources()]

    def all_preprocessed_sources(self):
        return self.preprocessed_sources | self.preprocessed_test_sources

    def all_sources(self):
        return set(self.all_regular_sources()) | set(self.all_test_sources())

    def all_basenames(self):
        return [os.path.basename(source) for source in self.all_sources()]

    def all_stems(self):
        return [os.path.splitext(b)[0] for b in self.all_basenames()]

    def generated_events_basenames(self):
        return [os.path.basename(s) for s in self.generated_events_sources]

    def generated_events_stems(self):
        return [os.path.splitext(b)[0] for b in self.generated_events_basenames()]


class TestManager(object):
    """Helps hold state related to tests."""
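The `WebIDLCollection` helpers in the hunk above all compose the same naming pipeline: full path, then basename, then stem, then `FooBinding.cpp`. A small standalone sketch of that pipeline (the sample paths are made up):

```python
import os

# Hypothetical source set; in the real backend these come from moz.build.
sources = {'/src/dom/webidl/Node.webidl', '/src/dom/webidl/Event.webidl'}

# basename -> stem -> Binding.cpp, mirroring all_regular_basenames(),
# all_regular_stems() and all_regular_cpp_basenames().
basenames = sorted(os.path.basename(s) for s in sources)
stems = [os.path.splitext(b)[0] for b in basenames]
cpp = ['%sBinding.cpp' % s for s in stems]
print(cpp)
```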
@@ -165,7 +85,6 @@ class CommonBackend(BuildBackend):
    def _init(self):
        self._idl_manager = XPIDLManager(self.environment)
        self._test_manager = TestManager(self.environment)
        self._webidls = WebIDLCollection()

    def consume_object(self, obj):
        if isinstance(obj, TestManifest):
@@ -173,44 +92,13 @@ class CommonBackend(BuildBackend):
                self._test_manager.add(test, flavor=obj.flavor,
                                       topsrcdir=obj.topsrcdir)

        elif isinstance(obj, XPIDLFile):
        if isinstance(obj, XPIDLFile):
            self._idl_manager.register_idl(obj.source_path, obj.module)

        elif isinstance(obj, WebIDLFile):
            self._webidls.sources.add(mozpath.join(obj.srcdir, obj.basename))
            obj.ack()

        elif isinstance(obj, GeneratedEventWebIDLFile):
            self._webidls.generated_events_sources.add(mozpath.join(
                obj.srcdir, obj.basename))
            obj.ack()

        elif isinstance(obj, TestWebIDLFile):
            self._webidls.test_sources.add(mozpath.join(obj.srcdir,
                obj.basename))
            obj.ack()

        elif isinstance(obj, PreprocessedTestWebIDLFile):
            self._webidls.preprocessed_test_sources.add(mozpath.join(
                obj.srcdir, obj.basename))
            obj.ack()

        elif isinstance(obj, GeneratedWebIDLFile):
            self._webidls.generated_sources.add(mozpath.join(obj.srcdir,
                obj.basename))
            obj.ack()

        elif isinstance(obj, PreprocessedWebIDLFile):
            self._webidls.preprocessed_sources.add(mozpath.join(
                obj.srcdir, obj.basename))
            obj.ack()

    def consume_finished(self):
        if len(self._idl_manager.idls):
            self._handle_idl_manager(self._idl_manager)

        self._handle_webidl_collection(self._webidls)

        # Write out a machine-readable file describing every test.
        path = os.path.join(self.environment.topobjdir, 'all-tests.json')
        with self._write_file(path) as fh:
@@ -5,15 +5,13 @@
from __future__ import unicode_literals

import itertools
import json
import logging
import os
import re
import types

from collections import namedtuple

import mozwebidl

import mozbuild.makeutil as mozmakeutil
from mozpack.copier import FilePurger
from mozpack.manifests import (
@@ -27,7 +25,9 @@ from ..frontend.data import (
    Defines,
    DirectoryTraversal,
    Exports,
    GeneratedEventWebIDLFile,
    GeneratedInclude,
    GeneratedWebIDLFile,
    HeaderFileSubstitution,
    HostProgram,
    HostSimpleProgram,
@@ -36,13 +36,17 @@ from ..frontend.data import (
    JavaJarData,
    LibraryDefinition,
    LocalInclude,
    PreprocessedTestWebIDLFile,
    PreprocessedWebIDLFile,
    Program,
    SandboxDerived,
    SandboxWrapped,
    SimpleProgram,
    TestManifest,
    TestWebIDLFile,
    VariablePassthru,
    XPIDLFile,
    TestManifest,
    WebIDLFile,
)
from ..util import (
    ensureParentDir,
@@ -264,6 +268,12 @@ class RecursiveMakeBackend(CommonBackend):

        self._backend_files = {}
        self._ipdl_sources = set()
        self._webidl_sources = set()
        self._generated_events_webidl_sources = set()
        self._test_webidl_sources = set()
        self._preprocessed_test_webidl_sources = set()
        self._preprocessed_webidl_sources = set()
        self._generated_webidl_sources = set()

        def detailed(summary):
            s = '{:d} total backend files. {:d} created; {:d} updated; {:d} unchanged'.format(
@@ -378,6 +388,33 @@ class RecursiveMakeBackend(CommonBackend):
        elif isinstance(obj, IPDLFile):
            self._ipdl_sources.add(mozpath.join(obj.srcdir, obj.basename))

        elif isinstance(obj, WebIDLFile):
            self._webidl_sources.add(mozpath.join(obj.srcdir, obj.basename))
            self._process_webidl_basename(obj.basename)

        elif isinstance(obj, GeneratedEventWebIDLFile):
            self._generated_events_webidl_sources.add(mozpath.join(obj.srcdir, obj.basename))

        elif isinstance(obj, TestWebIDLFile):
            self._test_webidl_sources.add(mozpath.join(obj.srcdir,
                                                       obj.basename))
            # Test WebIDL files are not exported.

        elif isinstance(obj, PreprocessedTestWebIDLFile):
            self._preprocessed_test_webidl_sources.add(mozpath.join(obj.srcdir,
                                                                    obj.basename))
            # Test WebIDL files are not exported.

        elif isinstance(obj, GeneratedWebIDLFile):
            self._generated_webidl_sources.add(mozpath.join(obj.srcdir,
                                                            obj.basename))
            self._process_webidl_basename(obj.basename)

        elif isinstance(obj, PreprocessedWebIDLFile):
            self._preprocessed_webidl_sources.add(mozpath.join(obj.srcdir,
                                                               obj.basename))
            self._process_webidl_basename(obj.basename)

        elif isinstance(obj, Program):
            self._process_program(obj.program, backend_file)

@@ -551,9 +588,6 @@ class RecursiveMakeBackend(CommonBackend):
                                 poison_windows_h=False):
        files_per_unified_file = 16

        # In case it's a generator.
        files = sorted(files)

        explanation = "\n" \
            "# We build files in 'unified' mode by including several files\n" \
            "# together into a single source file. This cuts down on\n" \
@@ -579,7 +613,7 @@ class RecursiveMakeBackend(CommonBackend):
            return itertools.izip_longest(fillvalue=dummy_fill_value, *args)

        for i, unified_group in enumerate(grouper(files_per_unified_file,
                                                  files)):
                                                  sorted(files))):
            just_the_filenames = list(filter_out_dummy(unified_group))
            yield '%s%d.%s' % (unified_prefix, i, unified_suffix), just_the_filenames
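The `grouper` helper in the hunk above chunks the sorted file list into fixed-size groups for unified compilation, padding the last group with a dummy value that is filtered out afterwards. A sketch of the same recipe on Python 3, where `itertools.izip_longest` is spelled `zip_longest` (group size 2 here instead of the backend's 16, to keep the example small):

```python
import itertools


def grouper(n, iterable, fillvalue=None):
    # Chunk an iterable into groups of n, padding the last group.
    args = [iter(iterable)] * n
    return itertools.zip_longest(fillvalue=fillvalue, *args)


files = ['c.cpp', 'a.cpp', 'e.cpp', 'b.cpp', 'd.cpp']
# Filter the padding back out, as filter_out_dummy() does in the backend.
groups = [[f for f in g if f is not None] for g in grouper(2, sorted(files))]
print(groups)
```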
@@ -686,10 +720,42 @@ class RecursiveMakeBackend(CommonBackend):
        with self._write_file(os.path.join(ipdl_dir, 'ipdlsrcs.mk')) as ipdls:
            mk.dump(ipdls, removal_guard=False)

        # These contain autogenerated sources that the build config doesn't
        # yet know about.
        self._may_skip['compile'] -= {'ipc/ipdl'}
        self._may_skip['compile'] -= {'dom/bindings', 'dom/bindings/test'}
        self._may_skip['compile'] -= set(['ipc/ipdl'])

        # Write out master lists of WebIDL source files.
        bindings_dir = os.path.join(self.environment.topobjdir, 'dom', 'bindings')

        mk = mozmakeutil.Makefile()

        def write_var(variable, sources):
            files = [os.path.basename(f) for f in sorted(sources)]
            mk.add_statement('%s += %s' % (variable, ' '.join(files)))
        write_var('webidl_files', self._webidl_sources)
        write_var('generated_events_webidl_files', self._generated_events_webidl_sources)
        write_var('test_webidl_files', self._test_webidl_sources)
        write_var('preprocessed_test_webidl_files', self._preprocessed_test_webidl_sources)
        write_var('generated_webidl_files', self._generated_webidl_sources)
        write_var('preprocessed_webidl_files', self._preprocessed_webidl_sources)

        all_webidl_files = itertools.chain(iter(self._webidl_sources),
                                           iter(self._generated_events_webidl_sources),
                                           iter(self._generated_webidl_sources),
                                           iter(self._preprocessed_webidl_sources))
        all_webidl_files = [os.path.basename(x) for x in all_webidl_files]
        all_webidl_sources = [re.sub(r'\.webidl$', 'Binding.cpp', x) for x in all_webidl_files]

        self._add_unified_build_rules(mk, all_webidl_sources,
                                      bindings_dir,
                                      unified_prefix='UnifiedBindings',
                                      unified_files_makefile_variable='unified_binding_cpp_files',
                                      poison_windows_h=True)

        # Assume that Somebody Else has responsibility for correctly
        # specifying removal dependencies for |all_webidl_sources|.
        with self._write_file(os.path.join(bindings_dir, 'webidlsrcs.mk')) as webidls:
            mk.dump(webidls, removal_guard=False)

        self._may_skip['compile'] -= set(['dom/bindings', 'dom/bindings/test'])

        self._fill_root_mk()
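The `write_var` closure and the `re.sub` rewrite above are what turn the collected source sets into make-consumable text: sorted basenames joined into an assignment, and `.webidl` suffixes rewritten to `Binding.cpp`. A sketch of both steps on a made-up source set:

```python
import os
import re

# Illustrative inputs; the real values are the self._*_sources sets.
sources = {'/src/dom/webidl/Node.webidl', '/src/dom/webidl/Event.webidl'}

# write_var(): sorted basenames joined into one make statement.
files = [os.path.basename(f) for f in sorted(sources)]
statement = 'webidl_files += %s' % ' '.join(files)
print(statement)

# The .webidl -> Binding.cpp rewrite used to build all_webidl_sources.
cpp = [re.sub(r'\.webidl$', 'Binding.cpp', x) for x in files]
print(cpp)
```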
@@ -942,6 +1008,10 @@ class RecursiveMakeBackend(CommonBackend):
    def _process_host_simple_program(self, program, backend_file):
        backend_file.write('HOST_SIMPLE_PROGRAMS += %s\n' % program)

    def _process_webidl_basename(self, basename):
        header = 'mozilla/dom/%sBinding.h' % os.path.splitext(basename)[0]
        self._install_manifests['dist_include'].add_optional_exists(header)

    def _process_test_manifest(self, obj, backend_file):
        # Much of the logic in this function could be moved to CommonBackend.
        self.backend_input_files.add(os.path.join(obj.topsrcdir,
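`_process_webidl_basename` above registers the namespaced header each non-test WebIDL file exports into dist/include. The mapping it applies, extracted as a standalone sketch:

```python
import os


def binding_header(basename):
    # Mirrors _process_webidl_basename(): strip the .webidl extension and
    # form the namespaced path of the generated binding header.
    return 'mozilla/dom/%sBinding.h' % os.path.splitext(basename)[0]


print(binding_header('Node.webidl'))
```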
@@ -1034,80 +1104,3 @@ class RecursiveMakeBackend(CommonBackend):

        for manifest in sorted(manifests):
            master.write('[include:%s]\n' % manifest)

    def _handle_webidl_collection(self, webidls):
        if not webidls.all_stems():
            return

        bindings_dir = os.path.join(self.environment.topobjdir, 'dom',
                                    'bindings')

        all_inputs = set(webidls.all_static_sources())
        for s in webidls.all_non_static_basenames():
            all_inputs.add(os.path.join(bindings_dir, s))

        generated_events_stems = webidls.generated_events_stems()
        exported_stems = webidls.all_regular_stems()

        # The WebIDL manager reads configuration from a JSON file. So, we
        # need to write this file early.
        o = dict(
            webidls=sorted(all_inputs),
            generated_events_stems=sorted(generated_events_stems),
            exported_stems=sorted(exported_stems),
        )

        file_lists = os.path.join(bindings_dir, 'file-lists.json')
        with self._write_file(file_lists) as fh:
            json.dump(o, fh, sort_keys=True)

        manager = mozwebidl.create_build_system_manager(
            self.environment.topsrcdir,
            self.environment.topobjdir,
            os.path.join(self.environment.topobjdir, 'dist')
        )

        # The manager is the source of truth on what files are generated.
        # Consult it for install manifests.
        include_dir = os.path.join(self.environment.topobjdir, 'dist',
                                   'include')
        for f in manager.expected_build_output_files():
            if f.startswith(include_dir):
                self._install_manifests['dist_include'].add_optional_exists(
                    f[len(include_dir)+1:])

        # We pass WebIDL info to make via a completely generated make file.
        mk = Makefile()
        mk.add_statement('nonstatic_webidl_files := %s' % ' '.join(
            sorted(webidls.all_non_static_basenames())))
        mk.add_statement('globalgen_sources := %s' % ' '.join(
            sorted(manager.GLOBAL_DEFINE_FILES)))
        mk.add_statement('test_sources := %s' % ' '.join(
            sorted('%sBinding.cpp' % s for s in webidls.all_test_stems())))

        # Add rules to preprocess bindings.
        for source in sorted(webidls.all_preprocessed_sources()):
            basename = os.path.basename(source)
            rule = mk.create_rule([basename])
            rule.add_dependencies([source, '$(GLOBAL_DEPS)'])
            rule.add_commands([
                # Remove the file before writing so bindings that go from
                # static to preprocessed don't end up writing to a symlink,
                # which would modify content in the source directory.
                '$(RM) $@',
                '$(call py_action,preprocessor,$(DEFINES) $(ACDEFINES) '
                '$(XULPPFLAGS) $< -o $@)'
            ])

        # Bindings are compiled in unified mode to speed up compilation and
        # to reduce linker memory size. Note that test bindings are separated
        # from regular ones so test bindings aren't shipped.
        self._add_unified_build_rules(mk,
                                      webidls.all_regular_cpp_basenames(),
                                      bindings_dir,
                                      unified_prefix='UnifiedBindings',
                                      unified_files_makefile_variable='unified_binding_cpp_files')

        webidls_mk = os.path.join(bindings_dir, 'webidlsrcs.mk')
        with self._write_file(webidls_mk) as fh:
            mk.dump(fh, removal_guard=False)
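The `file-lists.json` written in `_handle_webidl_collection` above has a small, fixed shape: three sorted lists keyed by `webidls`, `generated_events_stems`, and `exported_stems`. A sketch of a tiny instance (the sample values are made up):

```python
import json

# Illustrative inputs; the real values come from the WebIDLCollection.
o = dict(
    webidls=sorted(['/src/dom/webidl/Node.webidl']),
    generated_events_stems=sorted(['BlobEvent']),
    exported_stems=sorted(['Node']),
)

# json.dump(o, fh, sort_keys=True) in the backend; serialized to a string here.
blob = json.dumps(o, sort_keys=True)
print(blob)
```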
@@ -275,10 +275,6 @@ class MozbuildObject(ProcessExecutionMixin):
    def bindir(self):
        return os.path.join(self.topobjdir, 'dist', 'bin')

    @property
    def includedir(self):
        return os.path.join(self.topobjdir, 'dist', 'include')

    @property
    def statedir(self):
        return os.path.join(self.topobjdir, '.mozbuild')