The crash reporter symbol files are the easiest cross-platform way to
find static initializers. While some types of static initializers (e.g.
__attribute__((constructor)) functions) don't appear there in a notable
way, the static initializers we care most about tracking (those from
C++ globals) do. In fact, the count derived from the symbol files only
differs by 2 from the currently reported count of 125 on a linux64
build, so it is a good enough approximation. It also allows us to
easily track the count on Android, OSX and Windows builds, which we
currently don't do.
The tricky part is that the symbol files are in
dist/crashreporter-symbols/$lib/$fileid/$lib.sym, and $fileid is hard to
figure out. There is a `fileid` tool in testing/tools, but it is a
target binary, meaning it's not available on cross builds (OSX,
Android).
So the simplest approach is to gather the data while creating the
symbol files, which unfortunately requires jumping through some hoops
to make it happen for just the files we care about.
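As a rough illustration of the kind of scan this enables once the data
is gathered, here is a minimal Python sketch that walks a
dist/crashreporter-symbols layout and counts symbols that look like
static initializers for C++ globals. The _GLOBAL__sub_I_ name pattern
and the exact .sym record layout are assumptions for illustration, not
the code added by this patch.

  # Sketch: count likely C++ static initializers in Breakpad .sym files
  # laid out as dist/crashreporter-symbols/$lib/$fileid/$lib.sym.
  import os
  import sys

  def count_static_initializers(symbols_dir):
      count = 0
      for root, _dirs, files in os.walk(symbols_dir):
          for name in files:
              if not name.endswith('.sym'):
                  continue
              path = os.path.join(root, name)
              with open(path, encoding='utf-8', errors='replace') as f:
                  for line in f:
                      # FUNC records: FUNC <addr> <size> <param_size> <name>
                      if line.startswith('FUNC ') and '_GLOBAL__sub_I_' in line:
                          count += 1
      return count

  if __name__ == '__main__':
      print(count_static_initializers(sys.argv[1]))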
--HG--
extra : rebase_source : 458fed1ffd6f9294eefef61f10ff7a284af0d986
This one looks to be pretty straightforward. It irritates me that
the jar.mn entry doesn't explicitly say that the result comes from
the object directory, like
locale/browser/bookmarks.html (!bookmarks.html)
but that's for another day.
MozReview-Commit-ID: Cw8E0VJhSxv
--HG--
extra : rebase_source : a1045a5b564b0094b562729bc7234e69ec7a786d
Our bundled Hunspell now significantly differs from upstream Hunspell. Most
importantly, it supports loading dictionaries from jar: URIs, which is now a
requirement for loading bundled and extension dictionaries. This means that
system Hunspell libraries are no longer compatible with our spell checker
code. We should remove the option to use them so that users don't fall into
the trap of trying to use them.
MozReview-Commit-ID: 2ihJe6YOnGf
--HG--
extra : rebase_source : ceb091b9475a2b101156405a02a60015fc36da17
Original patch author is Takuro Ashie <ashie@clear-code.com>
Provide the ability to create a native EGL window and expose it to GL
code under NS_NATIVE_EGL_WINDOW. The native EGL window is owned and
managed by mozcontainer.
MozReview-Commit-ID: 4d0Kk6DRSaD
--HG--
extra : rebase_source : e4677ce51fbf918eb1b0257c66ca4b7220174bbb
The build system knows at build-backend time where to find each IDL
file; making xpidl-process.py rediscover this by searching through
directories for its input IDL files is silly. To remedy this, we're
going to modify things so full paths are passed into the script. Those
paths can then be used directly, with no searching.
The tail end of the xpidl Makefile.in contains a line, generated for
every xpt file:
$(1): $(addsuffix .idl,$(addprefix $(dist_idl_dir)/,$($(basename $(notdir $(1)))_deps)))
This line, in context, is saying that the xpt file depends on all of its
input IDL files. But xpidl-process.py already generates this
information when we pass it --depsdir, which we do. So this code is
redundant with what we already generate, and it can be removed.
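For reference, the dependency information the script writes out is a
make-format fragment along the lines of the sketch below. This is a
hypothetical illustration of a generator recording its own input
dependencies; the actual file name and format used by xpidl-process.py
may differ.

  # Hypothetical sketch: a generator recording its own dependencies in
  # a make-format file, which is what makes the Makefile.in rule above
  # redundant. Paths and naming are illustrative only.
  import os

  def write_depfile(depsdir, target, input_paths):
      depfile = os.path.join(depsdir, os.path.basename(target) + '.pp')
      with open(depfile, 'w') as f:
          # e.g. "xpcom_base.xpt: nsIFoo.idl nsIBar.idl"
          f.write('%s: %s\n' % (target, ' '.join(input_paths)))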
The previous patch required us to pass a single -I argument pointing at
$(DIST)/idl so IDL include statements would work correctly. This patch
lifts that limitation and explicitly points xpidl-process.py at the
locations of all the IDL source directories to search for included IDL
files. Invocations of xpidl-process.py no longer depend on IDL files
being copied to the objdir.
Building on the last patch, we can change the build process to pass in
the directories where the input IDL files can be found. It is
convenient to pass in just the relative source directory paths, to
encourage people to not look in the object directory and to make the
command lines slightly shorter.
xpidl-process.py still assumes that included IDL files can be found by
looking in a single directory. We add a single -I argument to the
invocation of xpidl-process.py to accommodate this short-sightedness.
The current IDL build setup assumes that all IDL files can be found in a
single directory. This setup requires that all IDL files be copied to a
single directory, which is suboptimal in terms of disk I/O and also
complicates things like generating IDL files at build time.
As a first step in moving away from this state of affairs,
xpidl-process.py needs to be taught that the input IDL files could
potentially be found in multiple directories. The current setup can
just specify $(DIST)/idl as the lone directory to examine. Future
patches will change this to examine multiple directories.
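A minimal sketch of the multi-directory include resolution described
above, using a hypothetical helper rather than the real
xpidl-process.py internals:

  # Sketch: resolve an included IDL file against several -I directories,
  # in order. find_include() is a hypothetical helper for illustration.
  import os

  def find_include(include_name, include_dirs):
      for d in include_dirs:
          candidate = os.path.join(d, include_name)
          if os.path.isfile(candidate):
              return candidate
      raise IOError('included IDL file not found: %s' % include_name)

  # With the initial setup the list is just [dist_idl_dir]; later patches
  # pass the relative source directories instead.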
The make backend was treating the first output of a GENERATED_FILES rule
specially, since it was the target of the rule containing the script
invocation. We want the outputs of GENERATED_FILES rules to be
FileAvoidWrite so that we avoid triggering downstream rules if the
outputs are unchanged, but if the target of the script invocation is
FileAvoidWrite, then make may continually re-run the script during a
no-op build.
The solution here is to use a stub file as the target of the script
invocation which will always be touched when the script runs. Since
nothing else in the build depends on the stub, we don't need to
FileAvoidWrite it. All actual outputs of the script can be
FileAvoidWrite, and make can properly avoid work for files that haven't
changed.
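The pattern reads roughly like the sketch below: real outputs are only
rewritten when their content changes, while the stub (the make target)
is always touched. This is a simplified stand-in for the build backend's
FileAvoidWrite machinery, not the actual implementation.

  # Simplified stand-in for the stub + FileAvoidWrite pattern.
  import os

  def write_if_changed(path, new_content):
      # Only rewrite the output when its content changes, so downstream
      # rules aren't needlessly triggered.
      if os.path.exists(path):
          with open(path) as f:
              if f.read() == new_content:
                  return False
      with open(path, 'w') as f:
          f.write(new_content)
      return True

  def touch_stub(stub_path):
      # Always touch the stub so make considers the rule up to date.
      with open(stub_path, 'a'):
          os.utime(stub_path, None)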
MozReview-Commit-ID: 3GejZw2tpqu
--HG--
extra : rebase_source : 2b9be82f893e89a4c2f254f05b1e8b9a0f9c631b
I don't understand how this will interact with the parts of the build
where we try to avoid installing the dist/bin manifest, but this makes
sense to me and it works locally for mobile/android and for browser/.
MozReview-Commit-ID: L7RtA4K3WrX
--HG--
extra : rebase_source : 3c08a5aab5398eb3b5685b18e5fe06e926db5f85
We want annotationProcessors to be compiled and archived into a JAR at
build time, ready to generate JNI wrappers. (That is, until we turn
the whole thing into a real annotation processor.) But even if we do
use a real annotation processor, we still need to generate SDK
bindings, which is less clearly expressed as an annotation processor.
(It's more of a build step.)
Gradle provides a huge number of ways to organize build logic to
achieve this: see
https://docs.gradle.org/current/userguide/organizing_build_logic.html.
Unfortunately, the best such way -- putting the code into
$topsrcdir/buildSrc -- has key disadvantages:
1) it pollutes the top-level $topsrcdir, and there's no way to change the
location of buildSrc (https://github.com/gradle/gradle/issues/2472);
2) it's complicated to have a dependent project
(mobile/android/annotations) expose its code via a buildSrc project;
3) using buildSrc at all appears to conflict with the Android-Gradle
plugin version that we are using.
Therefore, this commit does something much simpler: it adds a
Java-only project and uses the resulting Gradle "Jar" task and archive
output as input to the existing Gradle "generate JNI wrappers" task.
MozReview-Commit-ID: 2OyYLPneE1M
--HG--
extra : rebase_source : d99b74a0a1e0bb3e8f4d4540978328388e5c2e42
Some content in Makefile.in is removed because, after this change, the
scripts no longer invoke the preprocessor and thus no longer have
unknown dependencies outside what is provided in their inputs array.
The order of exports.PREFERENCES in properties-db changes because the
data file has shorthands placed after longhands. The only usage of it
is in test_css-properties-db.js, which doesn't care about the order.
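For reference, an entry of roughly this shape in moz.build declares the
script and its inputs; the file names here are hypothetical and the
syntax is an approximation of the GENERATED_FILES API, not the exact
change in this patch.

  # Illustrative moz.build sketch: a generated file with an explicit
  # script and inputs array, so the build system knows the full
  # dependency set without extra Makefile.in rules.
  GENERATED_FILES += ['properties-db.js']
  props = GENERATED_FILES['properties-db.js']
  props.script = 'gen_properties_db.py'
  props.inputs = ['properties-db.in.js']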
MozReview-Commit-ID: AMjzTRf2HYN
--HG--
extra : rebase_source : 7976e48e7c7bba467d77a34ab0d7709cde1ecdf4
We add a minimal Python script to run a process and prefix all its
output with a string. We change the automation tiers to evaluate all
make targets using this script.
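A minimal sketch of such a wrapper, assuming the child's output is read
as bytes and decoded leniently (the real script in the tree may differ):

  # Sketch: run a command and prefix every line of its combined output.
  # Usage: python prefix_output.py 'tier_base> ' make -C objdir target
  import subprocess
  import sys

  def run_prefixed(prefix, command):
      proc = subprocess.Popen(command, stdout=subprocess.PIPE,
                              stderr=subprocess.STDOUT)
      for line in iter(proc.stdout.readline, b''):
          sys.stdout.write(prefix + line.decode('utf-8', 'replace'))
          sys.stdout.flush()
      return proc.wait()

  if __name__ == '__main__':
      sys.exit(run_prefixed(sys.argv[1], sys.argv[2:]))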
MozReview-Commit-ID: 79g5KUd5ked
--HG--
extra : rebase_source : 63388a71b51e5abc05ca8bd48e180af72bf799e6
This patch contains the meat of the changes here. The following summarizes the changes:
1. xptinfo.h is rewritten to expose the new interface for reading the XPT data.
The nsXPTInterfaceInfo object exposes methods with the same signatures as
the methods on nsIInterfaceInfo, to make converting code which used
nsIInterfaceInfo as easy as possible, even when those methods don't have
signatures which make a ton of sense anymore. There are also a few methods
which are unnecessary (they return `true` or similar), which should be
removed over time.
Members of the data structures are made private in order to prevent reading
them directly. Code should instead call the getter methods. This should make
it easier to change their memory representation in the future.
Constructing these structs is still possible because they declare the
XPTConstruct class, which is implemented by the code generator, as a
`friend class`, giving it access to the private fields.
In addition, rather than using integers with flag constants, I opted to
use C++ bitfields to store individual flags, as I found that made it
easier both to write the code generator and to reason about the layouts
of the types.
I was able to shave a byte off each nsXPTParamInfo (4 bytes -> 3 bytes)
by shoving the flags into spare bits in the nsXPTType. Unfortunately
there was not enough room for the retval flag. Fortunately, our code
already depends on the retval parameter being the last parameter, so I
worked around this by removing the retval flag and instead having a
`hasretval` flag on the method itself.
2. An xptinfo.cpp file is added for out-of-line definitions of more complex
methods, and the internal implementation details of the perfect hash.
Notable is the handling of xptshim interfaces. As the type is uniform, a
flag is checked when trying to read constant information, and a different
table with pointers into webidl data structures is checked when the type is
determined to be a shim.
Ideally we could remove this once we remove the remaining consumers of the
existing shim interfaces.
3. A Python code generator which takes in the json XPT files generated in the
previous part, and emits an xptdata.cpp file with the data structures. I did
my best to heavily comment the code.
This code uses the friend class trick to construct the private fields of the
structs, and avoid a dependency on the ordering of fields in xptinfo.h.
The sInterfaces array's order is determined by a generated perfect hash
which is also written into the binary. This should allow for fast lookups
of interfaces in memory by IID or name. The hash function used for the
perfect hash is a simple FNV hash, as it's pretty fast (a minimal sketch
follows this list).
For perfect hashing of names, another table is created which contains
indexes into the sInterfaces table. Lookup by name is less common, and this
form of lookup should still be very fast.
4. The necessary Makefiles are updated to use the new code generator, and
generate the file correctly.
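As a rough illustration of the lookup scheme sketched in item 3, here is
a minimal FNV-1a hash in Python mapping an interface name to a slot in a
table of a given size. The real generated perfect hash tables (and the
separate name -> sInterfaces index table) are produced by the code
generator and are not reproduced here.

  # Illustrative 32-bit FNV-1a hash; with a perfect hash there are no
  # collisions for the known key set, so the slot directly indexes the
  # generated sInterfaces array.
  def fnv1a(data, seed=0x811c9dc5):
      h = seed
      for byte in bytearray(data):
          h = ((h ^ byte) * 0x01000193) & 0xffffffff
      return h

  def interface_slot(name, table_size):
      return fnv1a(name.encode('utf-8')) % table_size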
The uninstaller was being built as a side-effect of building `setup.exe`. In
Bug 1385227, that was moved from "somewhere" to part of the Windows installer
packaging, which happens after the zip and mar are generated. Since the
installer we ship is actually repackaged from the zip[1], we stopped shipping
translated uninstallers.
This changes things around so that the uninstaller gets translated:
- Explicitly build the uninstaller as part of the L10n repack step.
- Use the same logic to build the installer locally as we do to create the ones
we ship.
[1] Except on Thunderbird
Differential Revision: https://phabricator.services.mozilla.com/D672
--HG--
extra : rebase_source : 05fe935c1d2a9fbfeef786819bfe5913ed8ef862
extra : source : d6bf22099e2195dcb64c3c3d7700d3edd0850a3a