Mirror of https://github.com/dart-lang/sdk (synced 2024-10-06 15:09:45 +00:00)

[vm/aot] Add machine readable precompiler trace

This CL adds the --trace-precompiler-to option, which generates a machine
readable precompiler trace (a list of all compiled functions and their
dependencies). It also expands package:vm_snapshot_analysis with tools for
reading and analysing this trace. For example, the `snapshot_analysis explain
dynamic-calls` command lists all dynamic calls sorted by their impact on the
size of the AOT snapshot.

Issue https://github.com/dart-lang/sdk/issues/41249
Cq-Include-Trybots: luci.dart.try:pkg-linux-debug-try,pkg-linux-release-try,pkg-win-release-try,pkg-mac-release-try
Change-Id: Ie49143f4da375067991991e2ad20a41ec67bb1c3
Reviewed-on: https://dart-review.googlesource.com/c/sdk/+/152851
Commit-Queue: Vyacheslav Egorov <vegorov@google.com>
Reviewed-by: Martin Kustermann <kustermann@google.com>

Parent: 3ccf7ce5d2
Commit: 69ec1a9965
@ -134,6 +134,7 @@ compiler/test/deferred/load_mapping_test: Slow, Pass
compiler/test/end_to_end/dart2js_batch_test: Slow, Pass
compiler/test/end_to_end/exit_code_test: Slow, Pass
compiler/test/end_to_end/in_user_code_test: Slow, Pass
vm_snapshot_analysis/test/precompiler_trace_test: SkipSlow

[ $runtime == dart_precompiled ]
*: SkipByDesign # The pkg test framework imports dart:mirrors.
@ -4,6 +4,14 @@

- Add `buildComparisonTreemap` for constructing a treemap representing the diff
  between two size profiles.
- Implemented support for extracting call graph information from the AOT
  compiler trace (`--trace-precompiler-to` flag), see `precompiler_trace.dart`.
- New command `explain dynamic-calls` which estimates the impact of different
  dynamic calls on the resulting AOT snapshot size using information from the
  size dump (e.g. V8 snapshot profile) and AOT compiler trace.
- `summary` command can now use information from the AOT compiler trace to
  group packages/libraries together with their dependencies to give a more
  precise estimate of how much a specific package/library brings into the
  snapshot.

## 0.3.0
@ -2,7 +2,8 @@

 This package provides libraries and a utility for analysing the size and
 contents of Dart VM AOT snapshots based on the output of
-`--print-instructions-sizes-to` and `--write-v8-snapshot-profile-to` VM flags.
+`--print-instructions-sizes-to`, `--write-v8-snapshot-profile-to` and
+`--trace-precompiler-to` VM flags.

## AOT Snapshot Basics
@ -24,6 +25,9 @@ half of the snapshot, though this varies depending on the application.

* `--write-v8-snapshot-profile-to` is a graph representation of the snapshot:
  it attributes bytes written into the snapshot to a node in the heap graph. This
  format covers both data and code sections of the snapshot.
* `--trace-precompiler-to` gives information about dependencies between
  compiled functions, making it possible to determine why a certain function
  was pulled into the snapshot.

### Passing flags to the AOT compiler
@ -92,6 +96,45 @@ Here objects which can be attributed to `_Uri` take `5.7%` of the snapshot,

at the same time objects which can be attributed to the `dart:core` library
but not to any specific class within this library take `3.33%` of the snapshot.

This command also supports _estimating_ the cumulative impact of a library or a
package together with its dependencies, which can be computed from
a precompiler trace (the `--trace-precompiler-to` option). For example:

```console
$ snapshot_analysis summary -b package /tmp/profile.json
+-----------------------------+--------------+---------+
| Package                     | Size (Bytes) | Percent |
+-----------------------------+--------------+---------+
| package:compiler            |      5369933 |  38.93% |
| package:front_end           |      2644942 |  19.18% |
| package:kernel              |      1443568 |  10.47% |
| package:_fe_analyzer_shared |       944555 |   6.85% |
...
$ snapshot_analysis summary -b package -d 1 --precompiler-trace=/tmp/trace.json /tmp/profile.json
+------------------------------+--------------+---------+
| Package                      | Size (Bytes) | Percent |
+------------------------------+--------------+---------+
| package:compiler (+ 8 deps)  |      5762761 |  41.78% |
| package:front_end (+ 1 deps) |      2708981 |  19.64% |
| package:kernel               |      1443568 |  10.47% |
| package:_fe_analyzer_shared  |       944555 |   6.85% |
...
Dependency trees:

package:compiler (total 5762761 bytes)
├── package:js_ast (total 242490 bytes)
├── package:dart2js_info (total 101280 bytes)
├── package:crypto (total 27434 bytes)
│   ├── package:typed_data (total 11850 bytes)
│   └── package:convert (total 5185 bytes)
├── package:collection (total 15182 bytes)
├── package:_js_interop_checks (total 4627 bytes)
└── package:js_runtime (total 1815 bytes)

package:front_end (total 2708981 bytes)
└── package:package_config (total 64039 bytes)
```

### `compare`
@ -141,6 +184,31 @@ method) producing easy to consume output.

  data or executable code.
* `object-type` (default) collapses snapshot nodes based on their type only.

### `explain`

#### `explain dynamic-calls`

```console
$ snapshot_analysis explain dynamic-calls <profile.json> <trace.json>
```

This command generates a report listing dynamically dispatched selectors
and their approximate impact on the code size.

```console
$ snapshot_analysis explain dynamic-calls /tmp/profile.json /tmp/trace.json
+------------------------------+--------------+---------+----------+
| Selector                     | Size (Bytes) | Percent | Of total |
+------------------------------+--------------+---------+----------+
| set:requestHeader            |        10054 |  28.00% |    0.03% |
| get:scale                    |         3630 |  10.11% |    0.01% |
...
Dynamic call to set:requestHeader (retaining ~10054 bytes) occurs in:
    package:my-super-app/src/injector.dart::Injector.handle{body}

Dynamic call to get:scale (retaining ~3630 bytes) occurs in:
    package:some-dependency/src/image.dart::Image.==
```

## API
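The "retaining ~N bytes" figures above are estimates. One plausible way to derive such an estimate is to sum the sizes of everything a dynamic-call node dominates in the call graph. The sketch below (Python, purely illustrative; `size_of` and `dominated` are hypothetical helpers, not part of the package) shows that accumulation:

```python
def retained_size(node, size_of, dominated):
    """Sum the size of `node` plus the sizes of everything it dominates.

    Illustrative only: one plausible way a 'retaining ~N bytes' estimate
    could be derived from a dominator tree over the call graph. `size_of`
    maps a node to its byte size; `dominated` yields the nodes whose only
    path from the root goes through `node`.
    """
    total = size_of(node)
    for child in dominated(node):
        total += retained_size(child, size_of, dominated)
    return total
```

For a call node dominating two functions of 10054 bytes combined, the estimate is simply their sum.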
@ -162,6 +230,48 @@ to packages, libraries, classes and functions.

and creates `ProgramInfo` in an appropriate way, making it possible to write
code which works in the same way with both formats.

## Precompiler Trace Format (`--trace-precompiler-to=...`)

The AOT compiler can produce a JSON file containing information about compiled
functions and the dependencies between them. This file has the following
structure:

```json
{
  "trace": traceArray,
  "entities": entitiesArray,
  "strings": stringsArray
}
```

- `stringsArray` is an array of strings referenced by other parts of the trace
  by their index in this array.
- `entitiesArray` is a flattened array of entities:

  - `"C", <library-uri-idx>, <class-name-idx>, 0` - class record;
  - `"V", <class-idx>, <name-idx>, 0` - static field record;
  - `"F"|"S", <class-idx>, <name-idx>, <selector-id>` - function record (`F`
    for dynamic functions and `S` for static functions).

  Note that all records in this array occupy the same number of elements (`4`)
  to make random access by index possible.

- `traceArray` is a flattened array of precompilation events:

  - `"R"` - root event (always the first element);
  - `"E"` - end event (always the last element);
  - `"C", <function-idx>` - function compilation event.

  Root and function compilation events can additionally be followed by a
  sequence of references which enumerate outgoing dependencies discovered
  by the AOT compiler:

  - `<entity-idx>` - a reference to a function or a static field;
  - `"S", <selector-idx>` - a dynamic call with the given selector;
  - `"T", <selector-id>` - a dispatch table call with the given selector id.

A *flattened array* is an array of records formed by consecutive elements:
`[R0_0, R0_1, R0_2, R1_0, R1_1, R1_2, ...]`, where `R0_*` is the first record,
`R1_*` is the second record, and so on.
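To make the flattened entity layout concrete, here is a small illustrative decoder (Python is used here only for illustration; the package implements this in `precompiler_trace.dart`):

```python
def read_entities(entities, strings):
    """Decode the flattened 'entities' array described above.

    Each record occupies exactly 4 elements, so record i starts at
    index 4 * i, which is what makes random access possible.
    """
    records = []
    for i in range(0, len(entities), 4):
        tag = entities[i]
        if tag == 'C':  # 'C', <library-uri-idx>, <class-name-idx>, 0
            records.append(
                ('class', strings[entities[i + 1]], strings[entities[i + 2]]))
        elif tag == 'V':  # 'V', <class-idx>, <name-idx>, 0
            records.append(
                ('field', entities[i + 1], strings[entities[i + 2]]))
        elif tag in ('F', 'S'):  # function: 'F' dynamic, 'S' static
            records.append(('function', tag, entities[i + 1],
                            strings[entities[i + 2]], entities[i + 3]))
        else:
            raise ValueError('unrecognized entity type %r' % tag)
    return records
```

With `strings = ['dart:core', '_Uri', 'toString']` and `entities = ['C', 0, 1, 0, 'F', 0, 2, 7]`, the decoder yields a class record for `_Uri` in `dart:core` followed by a dynamic function record for its `toString` with selector id 7.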
## Features and bugs

Please file feature requests and bugs at the [issue tracker][tracker].
@ -8,6 +8,7 @@ import 'dart:io';

import 'package:args/command_runner.dart';

import 'package:vm_snapshot_analysis/src/commands/compare.dart';
import 'package:vm_snapshot_analysis/src/commands/explain.dart';
import 'package:vm_snapshot_analysis/src/commands/summary.dart';
import 'package:vm_snapshot_analysis/src/commands/treemap.dart';
@ -34,7 +35,8 @@ final runner = CommandRunner(
     _executableName, 'Tools for binary size analysis of Dart VM AOT snapshots.')
   ..addCommand(TreemapCommand())
   ..addCommand(CompareCommand())
-  ..addCommand(SummaryCommand());
+  ..addCommand(SummaryCommand())
+  ..addCommand(ExplainCommand());

void main(List<String> args) async {
  try {
@ -55,6 +55,23 @@ class Name {
    return result;
  }

  /// Split raw name into individual '.' separated components (e.g. names of
  /// its parent functions).
  List<String> get rawComponents {
    // Break the rest of the name into components.
    final result = raw.split('.');

    // Constructor names look like this 'new <ClassName>.<CtorName>' so
    // we need to concatenate the first two components back to form
    // the constructor name.
    if (result.first.startsWith('new ')) {
      result[0] = '${result[0]}.${result[1]}';
      result.removeAt(1);
    }

    return result;
  }

  static String collapse(String name) =>
      name.replaceAll(_collapseRe, '<anonymous closure>');
}
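The constructor-name special case in `rawComponents` can be illustrated with a short sketch (Python used purely for illustration; the logic mirrors the Dart getter above):

```python
def raw_components(raw):
    """Split a raw VM name into '.'-separated components, keeping a
    constructor name ('new <ClassName>.<CtorName>') as one component."""
    result = raw.split('.')
    if result[0].startswith('new '):
        # Rejoin 'new <ClassName>' with '<CtorName>' into one component.
        result[0] = result[0] + '.' + result[1]
        del result[1]
    return result
```

For example, `'new _Uri.notSimple'` stays a single component, while an ordinary nested name such as `'Image.=='` splits in two.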
pkg/vm_snapshot_analysis/lib/precompiler_trace.dart (new file, 524 lines)

@ -0,0 +1,524 @@
// Copyright (c) 2020, the Dart project authors. Please see the AUTHORS file
// for details. All rights reserved. Use of this source code is governed by a
// BSD-style license that can be found in the LICENSE file.

/// Helpers for working with the output of the `--trace-precompiler-to` VM
/// flag.
library vm_snapshot_analysis.precompiler_trace;

import 'dart:io';
import 'dart:math' as math;

import 'package:vm_snapshot_analysis/name.dart';
import 'package:vm_snapshot_analysis/program_info.dart';
import 'package:vm_snapshot_analysis/utils.dart';

/// Build [CallGraph] based on the trace written by the
/// `--trace-precompiler-to` flag.
Future<CallGraph> loadTrace(File input) async =>
    _TraceReader(await loadJson(input)).readTrace();

/// [CallGraphNode] represents a node of the call-graph. It can either be:
///
/// - a function, in which case [data] will be [ProgramInfoNode] of type
///   [NodeType.functionNode];
/// - a dynamic call node, in which case [data] will be a [String] selector;
/// - a dispatch table call node, in which case [data] will be an [int]
///   selector id.
///
class CallGraphNode {
  /// An index of this node in [CallGraph.nodes].
  final int id;

  /// Successors of this node.
  final List<CallGraphNode> succ = [];

  /// Predecessors of this node.
  final List<CallGraphNode> pred = [];

  /// Datum associated with this node: a [ProgramInfoNode] (function),
  /// a [String] (dynamic call selector) or an [int] (dispatch table
  /// selector id).
  final data;

  /// Preorder number of this node.
  ///
  /// Computed by [CallGraph.computeDominators].
  int _preorderNumber;

  /// Dominator of this node.
  ///
  /// Computed by [CallGraph.computeDominators].
  CallGraphNode dominator;

  /// Nodes dominated by this node.
  ///
  /// Computed by [CallGraph.computeDominators].
  List<CallGraphNode> dominated = _emptyNodeList;

  CallGraphNode(this.id, {this.data});

  bool get isFunctionNode =>
      data is ProgramInfoNode && data.type == NodeType.functionNode;

  bool get isClassNode =>
      data is ProgramInfoNode && data.type == NodeType.classNode;

  bool get isDynamicCallNode => data is String;

  /// Create outgoing edge from this node to the given node [n].
  void connectTo(CallGraphNode n) {
    if (n == this) {
      return;
    }

    if (!succ.contains(n)) {
      n.pred.add(this);
      succ.add(n);
    }
  }

  void _addDominatedBlock(CallGraphNode n) {
    if (identical(dominated, _emptyNodeList)) {
      dominated = [];
    }
    dominated.add(n);
    n.dominator = this;
  }

  void visitDominatorTree(bool Function(CallGraphNode n, int depth) callback,
      [int depth = 0]) {
    if (callback(this, depth)) {
      for (var n in dominated) {
        n.visitDominatorTree(callback, depth + 1);
      }
    }
  }

  @override
  String toString() {
    return 'CallGraphNode(${data is ProgramInfoNode ? data.qualifiedName : data})';
  }
}

const _emptyNodeList = <CallGraphNode>[];

class CallGraph {
  final ProgramInfo program;
  final List<CallGraphNode> nodes;

  // Mapping from [ProgramInfoNode] to a corresponding [CallGraphNode] (if any)
  // via [ProgramInfoNode.id].
  final List<CallGraphNode> _nodeByEntityId;

  CallGraph._(this.program, this.nodes, this._nodeByEntityId);

  CallGraphNode get root => nodes.first;

  CallGraphNode lookup(ProgramInfoNode node) => _nodeByEntityId[node.id];

  Iterable<CallGraphNode> get dynamicCalls =>
      nodes.where((n) => n.isDynamicCallNode);

  /// Compute a collapsed version of the call-graph, where nodes are merged
  /// into the node representing their enclosing entity of the given [type].
  CallGraph collapse(NodeType type, {bool dropCallNodes = false}) {
    final nodesByData = <Object, CallGraphNode>{};
    final nodeByEntityId = <CallGraphNode>[];

    ProgramInfoNode collapsed(ProgramInfoNode nn) {
      var n = nn;
      while (n.parent != null && n.type != type) {
        n = n.parent;
      }
      return n;
    }

    CallGraphNode nodeFor(Object data) {
      return nodesByData.putIfAbsent(data, () {
        final n = CallGraphNode(nodesByData.length, data: data);
        if (data is ProgramInfoNode) {
          if (nodeByEntityId.length <= data.id) {
            nodeByEntityId.length = data.id * 2 + 1;
          }
          nodeByEntityId[data.id] = n;
        }
        return n;
      });
    }

    final newNodes = nodes.map((n) {
      if (n.data is ProgramInfoNode) {
        return nodeFor(collapsed(n.data));
      } else if (!dropCallNodes) {
        return nodeFor(n.data);
      }
    }).toList(growable: false);

    for (var n in nodes) {
      for (var succ in n.succ) {
        final from = newNodes[n.id];
        final to = newNodes[succ.id];

        if (from != null && to != null) {
          from.connectTo(to);
        }
      }
    }

    return CallGraph._(
        program, nodesByData.values.toList(growable: false), nodeByEntityId);
  }

  /// Compute dominator tree of the call-graph.
  ///
  /// The code for dominator tree computation is taken verbatim from the
  /// native compiler (see runtime/vm/compiler/backend/flow_graph.cc).
  void computeDominators() {
    final size = nodes.length;

    // Compute preorder numbering for the graph using DFS.
    final parent = List<int>.filled(size, -1);
    final preorder = List<CallGraphNode>.filled(size, null);

    var N = 0;
    void dfs() {
      final stack = [_DfsState(p: -1, n: nodes.first)];
      while (stack.isNotEmpty) {
        final s = stack.removeLast();
        final p = s.p;
        final n = s.n;
        if (n._preorderNumber == null) {
          n._preorderNumber = N;
          preorder[n._preorderNumber] = n;
          parent[n._preorderNumber] = p;

          for (var w in n.succ) {
            stack.add(_DfsState(p: n._preorderNumber, n: w));
          }

          N++;
        }
      }
    }

    dfs();

    for (var node in nodes) {
      if (node._preorderNumber == null) {
        print('${node} is unreachable');
      }
    }

    // Use the SEMI-NCA algorithm to compute dominators. This is a two-pass
    // version of the Lengauer-Tarjan algorithm (LT is normally three passes)
    // that eliminates a pass by using nearest-common ancestor (NCA) to
    // compute immediate dominators from semidominators. It also removes a
    // level of indirection in the link-eval forest data structure.
    //
    // The algorithm is described in Georgiadis, Tarjan, and Werneck's
    // "Finding Dominators in Practice".
    // See http://www.cs.princeton.edu/~rwerneck/dominators/ .

    // All arrays are maps between preorder basic-block numbers.
    final idom = parent.toList(); // Immediate dominator.
    final semi = List<int>.generate(size, (i) => i); // Semidominator.
    final label =
        List<int>.generate(size, (i) => i); // Label for link-eval forest.

    void compressPath(int start, int current) {
      final next = parent[current];
      if (next > start) {
        compressPath(start, next);
        label[current] = math.min(label[current], label[next]);
        parent[current] = parent[next];
      }
    }

    // 1. First pass: compute semidominators as in Lengauer-Tarjan.
    // Semidominators are computed from a depth-first spanning tree and are an
    // approximation of immediate dominators.

    // Use a link-eval data structure with path compression. Implement path
    // compression in place by mutating the parent array. Each block has a
    // label, which is the minimum block number on the compressed path.

    // Loop over the blocks in reverse preorder (not including the graph
    // entry).
    for (var block_index = size - 1; block_index >= 1; --block_index) {
      // Loop over the predecessors.
      final block = preorder[block_index];
      // Clear the immediately dominated blocks in case ComputeDominators is
      // used to recompute them.
      for (final pred in block.pred) {
        // Look for the semidominator by ascending the semidominator path
        // starting from pred.
        final pred_index = pred._preorderNumber;
        var best = pred_index;
        if (pred_index > block_index) {
          compressPath(block_index, pred_index);
          best = label[pred_index];
        }

        // Update the semidominator if we've found a better one.
        semi[block_index] = math.min(semi[block_index], semi[best]);
      }

      // Now use label for the semidominator.
      label[block_index] = semi[block_index];
    }

    // 2. Compute the immediate dominators as the nearest common ancestor of
    // spanning tree parent and semidominator, for all blocks except the entry.
    for (var block_index = 1; block_index < size; ++block_index) {
      var dom_index = idom[block_index];
      while (dom_index > semi[block_index]) {
        dom_index = idom[dom_index];
      }
      idom[block_index] = dom_index;
      preorder[dom_index]._addDominatedBlock(preorder[block_index]);
    }
  }
}

class _DfsState {
  final int p;
  final CallGraphNode n;
  _DfsState({this.p, this.n});
}

/// Helper class for reading `--trace-precompiler-to` output.
///
/// See README.md for description of the format.
class _TraceReader {
  final List<Object> trace;
  final List<Object> strings;
  final List<Object> entities;

  final program = ProgramInfo();

  /// Mapping between entity ids and corresponding [ProgramInfoNode] nodes.
  final entityById = List<ProgramInfoNode>.filled(1024, null, growable: true);

  /// Mapping between functions (represented as [ProgramInfoNode]s) and
  /// their selector ids.
  final selectorIdMap = <ProgramInfoNode, int>{};

  /// Set of functions which can be reached through dynamic dispatch.
  final dynamicFunctions = Set<ProgramInfoNode>();

  _TraceReader(Map<String, dynamic> data)
      : strings = data['strings'],
        entities = data['entities'],
        trace = data['trace'];

  /// Read all trace events and construct the call graph based on them.
  CallGraph readTrace() {
    var pos = 0; // Position in the [trace] array.
    CallGraphNode currentNode;

    final nodes = <CallGraphNode>[];
    final nodeByEntityId = <CallGraphNode>[];
    final callNodesBySelector = <dynamic, CallGraphNode>{};
    final allocated = Set<ProgramInfoNode>();

    Object next() => trace[pos++];

    CallGraphNode makeNode({dynamic data}) {
      final n = CallGraphNode(nodes.length, data: data);
      nodes.add(n);
      return n;
    }

    CallGraphNode makeCallNode(dynamic selector) => callNodesBySelector
        .putIfAbsent(selector, () => makeNode(data: selector));

    CallGraphNode nodeFor(ProgramInfoNode n) {
      if (nodeByEntityId.length <= n.id) {
        nodeByEntityId.length = n.id * 2 + 1;
      }
      return nodeByEntityId[n.id] ??= makeNode(data: n);
    }

    void recordDynamicCall(String selector) {
      currentNode.connectTo(makeCallNode(selector));
    }

    void recordInterfaceCall(int selector) {
      currentNode.connectTo(makeCallNode(selector));
    }

    void recordStaticCall(ProgramInfoNode to) {
      currentNode.connectTo(nodeFor(to));
    }

    void recordFieldRef(ProgramInfoNode field) {
      currentNode.connectTo(nodeFor(field));
    }

    void recordAllocation(ProgramInfoNode cls) {
      currentNode.connectTo(nodeFor(cls));
      allocated.add(cls);
    }

    bool readRef() {
      final ref = next();
      if (ref is int) {
        final entity = getEntityAt(ref);
        if (entity.type == NodeType.classNode) {
          recordAllocation(entity);
        } else if (entity.type == NodeType.functionNode) {
          recordStaticCall(entity);
        } else if (entity.type == NodeType.other) {
          recordFieldRef(entity);
        }
      } else if (ref == 'S') {
        final String selector = strings[next()];
        recordDynamicCall(selector);
      } else if (ref == 'T') {
        recordInterfaceCall(next());
      } else if (ref == 'C' || ref == 'E') {
        pos--;
        return false;
      } else {
        throw FormatException('unexpected ref: ${ref}');
      }
      return true;
    }

    void readRefs() {
      while (readRef()) {}
    }

    void readEvents() {
      while (true) {
        final op = next();
        switch (op) {
          case 'E': // End.
            return;
          case 'R': // Roots.
            currentNode = nodeFor(program.root);
            readRefs();
            break;
          case 'C': // Function compilation.
            currentNode = nodeFor(getEntityAt(next()));
            readRefs();
            break;
          default:
            throw FormatException('Unknown event: ${op} at ${pos - 1}');
        }
      }
    }

    readEvents();

    // Finally connect nodes representing dynamic and dispatch table calls
    // to their potential targets.
    for (var cls in allocated) {
      for (var fun in cls.children.values.where(dynamicFunctions.contains)) {
        final funNode = nodeFor(fun);

        callNodesBySelector[selectorIdMap[fun]]?.connectTo(funNode);

        final name = fun.name;
        callNodesBySelector[name]?.connectTo(funNode);

        const dynPrefix = 'dyn:';
        const getterPrefix = 'get:';
        const extractorPrefix = '[tear-off-extractor] ';

        if (!name.startsWith(dynPrefix)) {
          // Normal methods can be hit by dyn: selectors if the class
          // does not contain a dedicated dyn: forwarder for this name.
          if (!cls.children.containsKey('$dynPrefix$name')) {
            callNodesBySelector['$dynPrefix$name']?.connectTo(funNode);
          }

          if (name.startsWith(getterPrefix)) {
            // Handle potential calls through getters: getter get:foo can be
            // hit by dyn:foo and foo selectors.
            final targetName = name.substring(getterPrefix.length);
            callNodesBySelector[targetName]?.connectTo(funNode);
            callNodesBySelector['$dynPrefix$targetName']?.connectTo(funNode);
          } else if (name.startsWith(extractorPrefix)) {
            // Handle method tear-off: [tear-off-extractor] get:foo is hit
            // by get:foo.
            callNodesBySelector[name.substring(extractorPrefix.length)]
                ?.connectTo(funNode);
          }
        }
      }
    }

    return CallGraph._(program, nodes, nodeByEntityId);
  }

  /// Return [ProgramInfoNode] representing the entity with the given [id].
  ProgramInfoNode getEntityAt(int id) {
    if (entityById.length <= id) {
      entityById.length = id * 2;
    }

    // Entity records have fixed size which allows us to perform random access.
    const elementsPerEntity = 4;
    return entityById[id] ??= readEntityAt(id * elementsPerEntity);
  }

  /// Read the entity at the given [index] in [entities].
  ProgramInfoNode readEntityAt(int index) {
    final type = entities[index];
    switch (type) {
      case 'C': // Class: 'C', <library-uri-idx>, <name-idx>, 0
        final libraryUri = strings[entities[index + 1]];
        final className = strings[entities[index + 2]];

        return program.makeNode(
            name: className,
            parent: getLibraryNode(libraryUri),
            type: NodeType.classNode);

      case 'S':
      case 'F': // Function: 'F'|'S', <class-idx>, <name-idx>, <selector-id>
        final classNode = getEntityAt(entities[index + 1]);
        final functionName = strings[entities[index + 2]];
        final int selectorId = entities[index + 3];

        final path = Name(functionName).rawComponents;
        if (path.last == 'FfiTrampoline') {
          path[path.length - 1] = '${path.last}@$index';
        }
        var node = program.makeNode(
            name: path.first, parent: classNode, type: NodeType.functionNode);
        for (var name in path.skip(1)) {
          node = program.makeNode(
              name: name, parent: node, type: NodeType.functionNode);
        }
        if (selectorId >= 0) {
          selectorIdMap[node] = selectorId;
        }
        if (type == 'F') {
          dynamicFunctions.add(node);
        }
        return node;

      case 'V': // Field: 'V', <class-idx>, <name-idx>, 0
        final classNode = getEntityAt(entities[index + 1]);
        final fieldName = strings[entities[index + 2]];

        return program.makeNode(
            name: fieldName, parent: classNode, type: NodeType.other);

      default:
        throw FormatException('unrecognized entity type ${type}');
    }
  }

  ProgramInfoNode getLibraryNode(String libraryUri) {
    final package = packageOf(libraryUri);
    var node = program.root;
    if (package != libraryUri) {
      node = program.makeNode(
          name: package, parent: node, type: NodeType.packageNode);
    }
    return program.makeNode(
        name: libraryUri, parent: node, type: NodeType.libraryNode);
  }
}
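For readers unfamiliar with dominator trees, the effect of `computeDominators` can be reproduced with the simpler iterative algorithm of Cooper, Harvey and Kennedy. The Python sketch below is illustrative only (it is not the SEMI-NCA code above, but both produce the same immediate dominators on a toy graph):

```python
def compute_dominators(succ, root):
    """Immediate dominators via the iterative Cooper/Harvey/Kennedy
    algorithm over a graph given as {node: [successors]}."""
    # Number reachable nodes in reverse postorder (root gets index 0).
    order, seen = [], {root}

    def dfs(n):
        for s in succ.get(n, []):
            if s not in seen:
                seen.add(s)
                dfs(s)
        order.append(n)

    dfs(root)
    rpo = list(reversed(order))
    index = {n: i for i, n in enumerate(rpo)}

    # Predecessor lists restricted to reachable nodes.
    pred = {}
    for n in rpo:
        for s in succ.get(n, []):
            pred.setdefault(s, []).append(n)

    idom = {root: root}

    def intersect(a, b):
        # Walk both nodes up the partially built dominator tree until they
        # meet; a smaller RPO index means closer to the root.
        while a != b:
            while index[a] > index[b]:
                a = idom[a]
            while index[b] > index[a]:
                b = idom[b]
        return a

    changed = True
    while changed:
        changed = False
        for n in rpo[1:]:
            new_idom = None
            for p in pred.get(n, []):
                if p in idom:
                    new_idom = p if new_idom is None else intersect(p, new_idom)
            if idom.get(n) != new_idom:
                idom[n] = new_idom
                changed = True
    return idom
```

On the diamond graph `A -> {B, C} -> D`, every node's immediate dominator is `A`: neither `B` nor `C` alone dominates `D`.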
@ -88,17 +88,23 @@ class ProgramInfo {
    recurse(root);
  }

-  int get totalSize {
-    var result = 0;
-    visit((pkg, lib, cls, fun, node) {
-      result += node.size ?? 0;
-    });
-    return result;
-  }
+  /// Total size of all the nodes in the program.
+  int get totalSize => root.totalSize;

  /// Convert this program info to a JSON map using [infoToJson] to convert
  /// data attached to nodes into its JSON representation.
  Map<String, dynamic> toJson() => root.toJson();

  /// Lookup a node in the program given a path to it.
  ProgramInfoNode lookup(List<String> path) {
    var n = root;
    for (var p in path) {
      if ((n = n.children[p]) == null) {
        break;
      }
    }
    return n;
  }
}

enum NodeType {
@ -114,6 +120,7 @@ String _typeToJson(NodeType type) => const {
      NodeType.libraryNode: 'library',
      NodeType.classNode: 'class',
      NodeType.functionNode: 'function',
      NodeType.other: 'other',
    }[type];

class ProgramInfoNode {
@ -161,6 +168,46 @@ class ProgramInfoNode {
        if (children.isNotEmpty)
          for (var clo in children.entries) clo.key: clo.value.toJson()
      };

  /// Returns the name of this node prefixed by the [qualifiedName] of its
  /// [parent].
  String get qualifiedName {
    var prefix = '';
    // Do not include root name or package name (library uri already contains
    // package name).
    if (parent?.parent != null && parent?.type != NodeType.packageNode) {
      prefix = parent.qualifiedName;
      if (parent.type != NodeType.libraryNode) {
        prefix += '.';
      } else {
        prefix += '::';
      }
    }
    return '$prefix$name';
  }

  @override
  String toString() {
    return '${_typeToJson(type)} ${qualifiedName}';
  }

  /// Returns path to this node such that [ProgramInfo.lookup] would return
  /// this node given its [path].
  List<String> get path {
    final result = <String>[];
    var n = this;
    while (n.parent != null) {
      result.add(n.name);
      n = n.parent;
    }
    return result.reversed.toList();
  }

  /// Cumulative size of this node and all of its children.
  int get totalSize {
    return (size ?? 0) +
        children.values.fold<int>(0, (s, n) => s + n.totalSize);
  }
}

/// Computes the size difference between two [ProgramInfo].
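The new cumulative `totalSize` getter is a simple recursive fold over the node tree. An equivalent sketch in Python (illustrative only, using plain dicts in place of `ProgramInfoNode`):

```python
def total_size(node):
    """Size of the node itself plus the cumulative size of its children,
    mirroring ProgramInfoNode.totalSize; 'size' may be absent or None."""
    own = node.get('size') or 0
    return own + sum(total_size(c) for c in node.get('children', {}).values())
```

A library node of 10 bytes with a 5-byte function child contributes 15 bytes to its parent's total.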
@ -239,6 +286,14 @@ class Histogram {

    return Histogram._(bucketInfo, buckets);
  }

  /// Rebuckets the histogram given the new bucketing rule.
  Histogram map(String Function(String) bucketFor) {
    return Histogram.fromIterable(buckets.keys,
        sizeOf: (key) => buckets[key],
        bucketFor: bucketFor,
        bucketInfo: bucketInfo);
  }
}

/// Construct the histogram of a specific [type] given a [ProgramInfo].
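`Histogram.map` rebuckets by feeding each existing bucket name through the new bucketing rule and summing the sizes of old buckets that collapse into the same new bucket. A minimal sketch (Python for illustration, with buckets modelled as a plain dict):

```python
def rebucket(buckets, bucket_for):
    """Merge histogram buckets under a coarser bucketing rule, summing
    the sizes of old buckets that map to the same new bucket."""
    out = {}
    for key, size in buckets.items():
        new_key = bucket_for(key)
        out[new_key] = out.get(new_key, 0) + size
    return out
```

For example, collapsing per-class buckets down to per-library buckets just sums the class sizes within each library.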
@ -280,11 +335,10 @@ Histogram computeHistogram(ProgramInfo info, HistogramType type,
    if (node.size == null || node.size == 0) {
      return;
    }

-    final bucket = bucketing.bucketFor(pkg, lib, cls, fun);
    if (!matchesFilter(lib, cls, fun)) {
      return;
    }
+    final bucket = bucketing.bucketFor(pkg, lib, cls, fun);
    buckets[bucket] = (buckets[bucket] ?? 0) + node.size;
  });
pkg/vm_snapshot_analysis/lib/src/commands/explain.dart (new file, 125 lines)

@@ -0,0 +1,125 @@
// Copyright (c) 2020, the Dart project authors. Please see the AUTHORS file
// for details. All rights reserved. Use of this source code is governed by a
// BSD-style license that can be found in the LICENSE file.

/// This command allows inspecting the information written into the
/// precompiler trace (`--trace-precompiler-to` output).
library vm_snapshot_analysis.explain;

import 'dart:async';
import 'dart:io';
import 'dart:math' as math;

import 'package:args/command_runner.dart';

import 'package:vm_snapshot_analysis/name.dart';
import 'package:vm_snapshot_analysis/precompiler_trace.dart';
import 'package:vm_snapshot_analysis/program_info.dart';
import 'package:vm_snapshot_analysis/utils.dart';

class ExplainCommand extends Command<void> {
  @override
  final name = 'explain';

  @override
  final description = '''
Explain why certain methods were pulled into the binary.
''';

  ExplainCommand() {
    addSubcommand(ExplainDynamicCallsCommand());
  }
}

/// Generates a summary report about dynamic calls sorted by an approximation
/// of their retained size, i.e. the amount of bytes these calls are pulling
/// into the snapshot.
class ExplainDynamicCallsCommand extends Command<void> {
  @override
  final name = 'dynamic-calls';

  @override
  final description = '''
This command explains the impact of dynamic calls on the binary size.

It needs an AOT snapshot size profile (an output of either the
--write-v8-snapshot-profile-to or --print-instructions-sizes-to flag) and
a precompiler trace (an output of the --trace-precompiler-to flag).
''';

  @override
  Future<void> run() async {
    final sizesJson = File(argResults.rest[0]);
    if (!sizesJson.existsSync()) {
      usageException('Size profile ${sizesJson.path} does not exist!');
    }

    final traceJson = File(argResults.rest[1]);
    if (!traceJson.existsSync()) {
      usageException('Precompiler trace ${traceJson.path} does not exist!');
    }

    final callGraph = await loadTrace(traceJson);
    callGraph.computeDominators();

    final programInfo = await loadProgramInfo(sizesJson);

    final histogram = Histogram.fromIterable<CallGraphNode>(
        callGraph.dynamicCalls, sizeOf: (dynamicCall) {
      // Compute approximate retained size by traversing the dominator tree
      // and consulting the snapshot profile.
      var totalSize = 0;
      dynamicCall.visitDominatorTree((retained, depth) {
        if (retained.isFunctionNode) {
          // Note that the call graph keeps private library keys intact in
          // the names (because we need to distinguish dynamic invocations
          // with the same private name in different libraries), so we need
          // to scrub the path before we look up information in the profile.
          final path = (retained.data as ProgramInfoNode)
              .path
              .map((n) => Name(n).scrubbed)
              .toList();
          if (path.last.startsWith('[tear-off] ')) {
            // The tear-off forwarder is placed into the function that is
            // torn off, so we need to slightly tweak the path to find it.
            path.insert(
                path.length - 1, path.last.replaceAll('[tear-off] ', ''));
          }
          final retainedSize = programInfo.lookup(path);
          totalSize += (retainedSize?.totalSize ?? 0);
        }
        return true;
      });
      return totalSize;
    }, bucketFor: (n) {
      return (n.data as String).replaceAll('dyn:', '');
    }, bucketInfo: BucketInfo(nameComponents: ['Selector']));

    printHistogram(programInfo, histogram,
        prefix: histogram.bySize.where((key) => histogram.buckets[key] > 0));

    // For the top 10 dynamic selectors print the functions which contain
    // these dynamic calls.
    for (var selector
        in histogram.bySize.take(math.min(10, histogram.length))) {
      final dynSelector = 'dyn:$selector';
      final callNodes = callGraph.nodes
          .where((n) => n.data == selector || n.data == dynSelector);

      print('\nDynamic call to ${selector}'
          ' (retaining ~${histogram.buckets[selector]} bytes) occurs in:');
      for (var node in callNodes) {
        for (var pred in node.pred) {
          print(' ${pred.data.qualifiedName}');
        }
      }
    }
  }

  @override
  String get invocation =>
      super.invocation.replaceAll('[arguments]', '<sizes.json> <trace.json>');
}
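The retained-size estimate used by `dynamic-calls` boils down to summing sizes over a node's dominator subtree: everything a call dominates would drop out of the snapshot if the call disappeared. A minimal Python sketch, using hypothetical dict-based structures rather than the package's classes:

```python
def retained_size(node, dominated, size_of):
    """Approximate retained size: the node's own size plus the sizes of
    everything it transitively dominates in the dominator tree."""
    total = size_of(node) or 0
    for child in dominated.get(node, []):
        total += retained_size(child, dominated, size_of)
    return total

# 'call' dominates 'f' and 'g'; 'f' dominates 'h'.
dominated = {'call': ['f', 'g'], 'f': ['h']}
sizes = {'call': 0, 'f': 100, 'g': 40, 'h': 10}
total = retained_size('call', dominated, sizes.get)
```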
@@ -9,10 +9,13 @@ library vm_snapshot_analysis.summary;

import 'dart:async';
import 'dart:io';
import 'dart:math' as math;

import 'package:args/command_runner.dart';
import 'package:meta/meta.dart';

import 'package:vm_snapshot_analysis/ascii_table.dart';
import 'package:vm_snapshot_analysis/precompiler_trace.dart';
import 'package:vm_snapshot_analysis/program_info.dart';
import 'package:vm_snapshot_analysis/utils.dart';
import 'package:vm_snapshot_analysis/v8_profile.dart';
@@ -46,6 +49,22 @@ This tool can process snapshot size reports produced by
        abbr: 'w',
        help: 'Filter output using the given glob.',
      )
      ..addOption(
        'precompiler-trace',
        abbr: 't',
        help: '''
Precompiler trace to establish dependencies between libraries/packages.
''',
      )
      ..addOption(
        'deps-collapse-depth',
        abbr: 'd',
        defaultsTo: '3',
        help: '''
Depth at which nodes in the dependency tree are collapsed together.
Only has effect if --precompiler-trace is also passed.
''',
      )
      ..addFlag('collapse-anonymous-closures', help: '''
Collapse all anonymous closures from the same scope into a single entry.
When comparing size of AOT snapshots for two different versions of a
@@ -74,6 +93,21 @@ precisely based on their source position (which is included in their name).
      usageException('Input file ${input.path} does not exist!');
    }

    final granularity = _parseHistogramType(argResults['by']);

    final traceJson = argResults['precompiler-trace'];
    if (traceJson != null) {
      if (!File(traceJson).existsSync()) {
        usageException('Trace ${traceJson} does not exist!');
      }

      if (granularity != HistogramType.byPackage &&
          granularity != HistogramType.byLibrary) {
        usageException(
            '--precompiler-trace only has effect when summarizing by library or package');
      }
    }

    final columnWidth = argResults['column-width'];
    final maxWidth = int.tryParse(columnWidth);
    if (maxWidth == null) {
@ -81,11 +115,20 @@ precisely based on their source position (which is included in their name).
|
|||
'Specified column width (${columnWidth}) is not an integer');
|
||||
}
|
||||
|
||||
final depthCollapseDepthStr = argResults['deps-collapse-depth'];
|
||||
final depsCollapseDepth = int.tryParse(depthCollapseDepthStr);
|
||||
if (depsCollapseDepth == null) {
|
||||
usageException('Specified depthCollapseDepth (${depthCollapseDepthStr})'
|
||||
' is not an integer');
|
||||
}
|
||||
|
||||
await outputSummary(input,
|
||||
maxWidth: maxWidth,
|
||||
granularity: _parseHistogramType(argResults['by']),
|
||||
granularity: granularity,
|
||||
collapseAnonymousClosures: argResults['collapse-anonymous-closures'],
|
||||
filter: argResults['where']);
|
||||
filter: argResults['where'],
|
||||
traceJson: traceJson != null ? File(traceJson) : null,
|
||||
depsCollapseDepth: depsCollapseDepth);
|
||||
}
|
||||
|
||||
static HistogramType _parseHistogramType(String value) {
|
||||
@@ -107,14 +150,107 @@ void outputSummary(File input,
    {int maxWidth = 0,
    bool collapseAnonymousClosures = false,
    HistogramType granularity = HistogramType.bySymbol,
-   String filter}) async {
+   String filter,
+   File traceJson,
+   int depsCollapseDepth = 3,
+   int topToReport = 30}) async {
  final info = await loadProgramInfo(input);

  // Compute histogram.
- final histogram = computeHistogram(info, granularity, filter: filter);
+ var histogram = computeHistogram(info, granularity, filter: filter);

  // If a precompiler trace is provided, collapse entries based on the
  // dependency graph (dominator tree) extracted from the trace.
  void Function() printDependencyTrees;
  if (traceJson != null &&
      (granularity == HistogramType.byLibrary ||
          granularity == HistogramType.byPackage)) {
    var callGraph = await loadTrace(traceJson);

    // Convert the call graph into an approximate dependency graph, dropping
    // any dynamic and dispatch-table based dependencies from the graph and
    // only following the static call, field access and allocation edges.
    callGraph = callGraph.collapse(
        granularity == HistogramType.byLibrary
            ? NodeType.libraryNode
            : NodeType.packageNode,
        dropCallNodes: true);
    callGraph.computeDominators();

    // Compute a name mapping from histogram buckets to new coarser buckets,
    // by collapsing the dependency tree at the [depsCollapseDepth] level:
    // node 'Foo' with k dominated children (k > 0) becomes 'Foo (+k deps)'
    // and all its children are remapped to this bucket.
    final mapping = <String, String>{};
    final collapsed = <String, CallGraphNode>{};
    callGraph.root.visitDominatorTree((n, depth) {
      if (depth >= depsCollapseDepth) {
        final children = <String>[];
        n.visitDominatorTree((child, depth) {
          if (n != child && child.data is ProgramInfoNode) {
            children.add(child.data.name);
          }
          return true;
        }, depth + 1);

        if (children.isNotEmpty) {
          final newName = '${n.data.name} (+ ${children.length} deps)';
          mapping[n.data.name] = newName;
          collapsed[newName] = n;
          for (var name in children) {
            mapping[name] = newName;
          }
        }
        return false;
      }
      return true;
    });

    // Compute cumulative sizes and node counts for each node in the
    // dominator tree. We are going to use this information later to display
    // dependency trees at the end of the summary report.
    // This needs to be done before we lose the original histogram.
    final totalSizes = <String, int>{};
    final totalCounts = <String, int>{};
    void computeTotalsRecursively(CallGraphNode node) {
      var totalSize = histogram.buckets[node.data.name] ?? 0;
      var totalCount = 1;
      for (var n in node.dominated) {
        computeTotalsRecursively(n);
        totalSize += totalSizes[n.data.name];
        totalCount += totalCounts[n.data.name];
      }
      totalSizes[node.data.name] = totalSize;
      totalCounts[node.data.name] = totalCount;
    }

    computeTotalsRecursively(callGraph.root);

    // Transform the histogram using the mapping which we computed.
    histogram = histogram.map((bucket) => mapping[bucket] ?? bucket);

    // Create a helper function to print dependency trees at the end of the
    // report.
    printDependencyTrees = () {
      // This will be the list of collapsed entries which were among those
      // [topToReport] printed by [printHistogram] below.
      final collapsedEntries = histogram.bySize
          .take(topToReport)
          .map((k) => collapsed[k])
          .where((n) => n != null);
      if (collapsedEntries.isNotEmpty) {
        print('\nDependency trees:');
        for (var n in collapsedEntries) {
          print(
              '\n${n.data.qualifiedName} (total ${totalSizes[n.data.name]} bytes)');
          _printDominatedNodes(n,
              totalSizes: totalSizes, totalCounts: totalCounts);
        }
      }
    };
  }

  // Now produce the report table.
- const topToReport = 30;
  printHistogram(info, histogram,
      prefix: histogram.bySize.take(topToReport), maxWidth: maxWidth);
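The bucket-collapsing step above (rename a node `Foo` at the cutoff depth to `Foo (+k deps)` and remap everything it dominates into that bucket) can be sketched in Python over hypothetical dict-based dominator trees:

```python
def collapse_at_depth(tree, root, cutoff):
    """Map each node name to the bucket it folds into when the dominator
    tree is cut off [cutoff] levels below the root."""
    mapping = {}

    def subtree(node):
        # All names transitively dominated by `node`.
        names = []
        for child in tree.get(node, []):
            names.append(child)
            names.extend(subtree(child))
        return names

    def visit(node, depth):
        if depth >= cutoff:
            children = subtree(node)
            if children:
                new_name = '%s (+ %d deps)' % (node, len(children))
                mapping[node] = new_name
                for name in children:
                    mapping[name] = new_name
            return  # Do not descend further; children are already remapped.
        for child in tree.get(node, []):
            visit(child, depth + 1)

    visit(root, 0)
    return mapping

tree = {'root': ['a'], 'a': ['b'], 'b': ['c', 'd']}
m = collapse_at_depth(tree, 'root', 2)
```

Histogram buckets would then be renamed via `mapping.get(bucket, bucket)` and summed, exactly as `histogram.map` does above.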

@@ -127,4 +263,70 @@ void outputSummary(File input,

  print(bucketLegend);

  printDependencyTrees?.call();
}

/// Helper method for printing a dominator tree in the form:
///
///     A (total ... bytes)
///     ├── B (total ... bytes)
///     ├── C (total ... bytes)
///     │   ├── D (total ... bytes)
///     │   └── E (total ... bytes)
///     ├── F (total ... bytes)
///     └── G (total ... bytes)
///
/// Cuts the printing off at the given depth ([cutOffDepth]) and after the
/// given number of children at each node ([maxChildrenToPrint]).
void _printDominatedNodes(CallGraphNode node,
    {int cutOffDepth = 2,
    int maxChildrenToPrint = 10,
    List<bool> isLastPerLevel,
    @required Map<String, int> totalSizes,
    @required Map<String, int> totalCounts}) {
  isLastPerLevel ??= [];

  if (isLastPerLevel.length >= cutOffDepth) {
    maxChildrenToPrint = 0;
  }

  final sizes = node.dominated.map((n) => totalSizes[n.data.name]).toList();
  final order = List.generate(node.dominated.length, (i) => i)
    ..sort((a, b) => sizes[b] - sizes[a]);
  final lastIndex = order.lastIndexWhere((i) => sizes[i] > 0);

  for (var j = 0, n = math.min(maxChildrenToPrint - 1, lastIndex);
      j <= n;
      j++) {
    final isLast = j == lastIndex;
    final i = order[j];
    final n = node.dominated[i];
    final size = sizes[i];
    isLastPerLevel.add(isLast);
    print(
        '${_treeLines(isLastPerLevel)}${n.data.qualifiedName} (total ${size} bytes)');
    _printDominatedNodes(n,
        cutOffDepth: cutOffDepth,
        isLastPerLevel: isLastPerLevel,
        totalCounts: totalCounts,
        totalSizes: totalSizes);
    isLastPerLevel.removeLast();
  }

  if (maxChildrenToPrint < lastIndex) {
    isLastPerLevel.add(true);
    print(
        '${_treeLines(isLastPerLevel)} ... (+${totalCounts[node.data.name] - 1} deps)');
    isLastPerLevel.removeLast();
  }
}

String _treeLines(List<bool> isLastPerLevel) {
  final sb = StringBuffer();
  for (var i = 0; i < isLastPerLevel.length - 1; i++) {
    sb.write(isLastPerLevel[i] ? '    ' : '│   ');
  }
  sb.write(isLastPerLevel.last ? '└── ' : '├── ');
  return sb.toString();
}
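The `_treeLines` helper builds each line's box-drawing prefix from one is-last flag per tree level: ancestors that were last in their group contribute blank padding, others a vertical rule, and the final level picks the branch connector. A standalone Python sketch of the same logic:

```python
def tree_lines(is_last_per_level):
    """Build the box-drawing prefix for one line of a tree printout."""
    prefix = ''
    # Ancestor levels: a finished branch leaves a gap, an open one a rule.
    for is_last in is_last_per_level[:-1]:
        prefix += '    ' if is_last else '│   '
    # The node's own level: elbow for the last sibling, tee otherwise.
    prefix += '└── ' if is_last_per_level[-1] else '├── '
    return prefix
```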
@@ -1,5 +1,5 @@
name: vm_snapshot_analysis
-description: Utilities for working with non-symbolic stack traces.
+description: Utilities for analysing AOT snapshot size.
version: 0.4.0

homepage: https://github.com/dart-lang/sdk/tree/master/pkg/vm_snapshot_analysis
@@ -4,7 +4,6 @@

import 'dart:io';

import 'package:path/path.dart' as path;
import 'package:test/test.dart';

import 'package:vm_snapshot_analysis/instruction_sizes.dart'

@@ -12,17 +11,7 @@ import 'package:vm_snapshot_analysis/instruction_sizes.dart'
import 'package:vm_snapshot_analysis/program_info.dart';
import 'package:vm_snapshot_analysis/utils.dart';

-final dart2native = () {
-  final sdkBin = path.dirname(Platform.executable);
-  final dart2native =
-      path.join(sdkBin, Platform.isWindows ? 'dart2native.bat' : 'dart2native');
-
-  if (!File(dart2native).existsSync()) {
-    throw 'Failed to locate dart2native in the SDK';
-  }
-
-  return path.canonicalize(dart2native);
-}();
+import 'utils.dart';

final testSource = {
  'input.dart': """

@@ -651,61 +640,6 @@ Future withV8Profile(String prefix, Map<String, String> source,
        Future Function(String sizesJson) f) =>
    withFlag(prefix, source, '--write_v8_snapshot_profile_to', f);

-Future withFlag(String prefix, Map<String, String> source, String flag,
-    Future Function(String sizesJson) f) {
-  return withTempDir(prefix, (dir) async {
-    final outputBinary = path.join(dir, 'output.exe');
-    final sizesJson = path.join(dir, 'sizes.json');
-    final packages = path.join(dir, '.packages');
-    final mainDart = path.join(dir, 'main.dart');
-
-    // Create test input.
-    for (var file in source.entries) {
-      await File(path.join(dir, file.key)).writeAsString(file.value);
-    }
-    await File(packages).writeAsString('''
-input:./
-''');
-    await File(mainDart).writeAsString('''
-import 'package:input/input.dart' as input;
-
-void main(List<String> args) => input.main(args);
-''');
-
-    // Compile input.dart to native and output instruction sizes.
-    final result = await Process.run(dart2native, [
-      '-o',
-      outputBinary,
-      '--packages=$packages',
-      '--extra-gen-snapshot-options=$flag=$sizesJson',
-      mainDart,
-    ]);
-
-    expect(result.exitCode, equals(0), reason: '''
-Compilation completed successfully.
-
-stdout: ${result.stdout}
-stderr: ${result.stderr}
-''');
-    expect(File(outputBinary).existsSync(), isTrue,
-        reason: 'Output binary exists');
-    expect(File(sizesJson).existsSync(), isTrue,
-        reason: 'Instruction sizes output exists');
-
-    await f(sizesJson);
-  });
-}
-
-Future withTempDir(String prefix, Future Function(String dir) f) async {
-  final tempDir =
-      Directory.systemTemp.createTempSync('instruction-sizes-test-${prefix}');
-  try {
-    await f(tempDir.path);
-  } finally {
-    tempDir.deleteSync(recursive: true);
-  }
-}
-
// On Windows there is an issue with interpreting the entry point URI as a
// package URI: it instead gets interpreted as a file URI, which breaks
// comparison. So we simply ignore the entry point library (main.dart).
pkg/vm_snapshot_analysis/test/precompiler_trace_test.dart (new file, 102 lines)

@@ -0,0 +1,102 @@
// Copyright (c) 2020, the Dart project authors. Please see the AUTHORS file
// for details. All rights reserved. Use of this source code is governed by a
// BSD-style license that can be found in the LICENSE file.

import 'dart:io';

import 'package:test/test.dart';
import 'package:vm_snapshot_analysis/precompiler_trace.dart';

import 'utils.dart';

final testSource = {
  'input.dart': """
class K {
  final value;
  const K(this.value);
}

@pragma('vm:never-inline')
dynamic makeSomeClosures() {
  return [
    () => const K(0),
    () => const K(1),
    () => 11,
  ];
}

class A {
  @pragma('vm:never-inline')
  dynamic tornOff() {
    return const K(2);
  }
}

class B {
  @pragma('vm:never-inline')
  dynamic tornOff() {
    return const K(3);
  }
}

class C {
  static dynamic tornOff() async {
    return const K(4);
  }
}

@pragma('vm:never-inline')
Function tearOff(dynamic o) {
  return o.tornOff;
}

void main(List<String> args) {
  for (var cl in makeSomeClosures()) {
    print(cl());
  }
  print(tearOff(args.isEmpty ? A() : B()));
  print(C.tornOff);
}
"""
};

void main() async {
  if (!Platform.executable.contains('dart-sdk')) {
    // If we are not running from the prebuilt SDK then this test does nothing.
    return;
  }

  group('precompiler-trace', () {
    test('basic-parsing', () async {
      await withFlag('basic-parsing', testSource, '--trace_precompiler_to',
          (json) async {
        final callGraph = await loadTrace(File(json));
        callGraph.computeDominators();

        final main = callGraph.program
            .lookup(['package:input', 'package:input/input.dart', '', 'main']);
        final mainNode = callGraph.lookup(main);

        final retainedClasses = mainNode.dominated
            .where((n) => n.isClassNode)
            .map((n) => n.data.name)
            .toList();
        final retainedFunctions = mainNode.dominated
            .where((n) => n.isFunctionNode)
            .map((n) => n.data.name)
            .toList();
        expect(retainedClasses, containsAll(['A', 'B', 'K']));
        expect(retainedFunctions, containsAll(['print', 'tearOff']));

        final getTearOffCall =
            callGraph.dynamicCalls.firstWhere((n) => n.data == 'get:tornOff');
        expect(
            getTearOffCall.dominated.map((n) => n.data.qualifiedName),
            equals([
              'package:input/input.dart::B.[tear-off-extractor] get:tornOff',
              'package:input/input.dart::A.[tear-off-extractor] get:tornOff',
            ]));
      });
    });
  });
}
pkg/vm_snapshot_analysis/test/utils.dart (new file, 75 lines)

@@ -0,0 +1,75 @@
// Copyright (c) 2020, the Dart project authors. Please see the AUTHORS file
// for details. All rights reserved. Use of this source code is governed by a
// BSD-style license that can be found in the LICENSE file.

import 'dart:io';

import 'package:path/path.dart' as path;
import 'package:test/test.dart';

final dart2native = () {
  final sdkBin = path.dirname(Platform.executable);
  final dart2native =
      path.join(sdkBin, Platform.isWindows ? 'dart2native.bat' : 'dart2native');

  if (!File(dart2native).existsSync()) {
    throw 'Failed to locate dart2native in the SDK';
  }

  return path.canonicalize(dart2native);
}();

Future withFlag(String prefix, Map<String, String> source, String flag,
    Future Function(String sizesJson) f) {
  return withTempDir(prefix, (dir) async {
    final outputBinary = path.join(dir, 'output.exe');
    final sizesJson = path.join(dir, 'sizes.json');
    final packages = path.join(dir, '.packages');
    final mainDart = path.join(dir, 'main.dart');

    // Create test input.
    for (var file in source.entries) {
      await File(path.join(dir, file.key)).writeAsString(file.value);
    }
    await File(packages).writeAsString('''
input:./
''');
    await File(mainDart).writeAsString('''
import 'package:input/input.dart' as input;

void main(List<String> args) => input.main(args);
''');

    // Compile input.dart to native and output instruction sizes.
    final result = await Process.run(dart2native, [
      '-o',
      outputBinary,
      '--packages=$packages',
      '--extra-gen-snapshot-options=$flag=$sizesJson',
      mainDart,
    ]);

    expect(result.exitCode, equals(0), reason: '''
Compilation completed successfully.

stdout: ${result.stdout}
stderr: ${result.stderr}
''');
    expect(File(outputBinary).existsSync(), isTrue,
        reason: 'Output binary exists');
    expect(File(sizesJson).existsSync(), isTrue,
        reason: 'Instruction sizes output exists');

    await f(sizesJson);
  });
}

Future withTempDir(String prefix, Future Function(String dir) f) async {
  final tempDir =
      Directory.systemTemp.createTempSync('instruction-sizes-test-${prefix}');
  try {
    await f(tempDir.path);
  } finally {
    tempDir.deleteSync(recursive: true);
  }
}
@@ -16,6 +16,7 @@ namespace dart {

class ClassTable;
class Precompiler;
class PrecompilerTracer;

namespace compiler {

@@ -75,6 +76,7 @@ class SelectorMap {
  int32_t NumIds() const { return selectors_.length(); }

  friend class dart::Precompiler;
  friend class dart::PrecompilerTracer;
  friend class DispatchTableGenerator;
  friend class SelectorRow;
@@ -8,6 +8,7 @@
#include "vm/class_finalizer.h"
#include "vm/code_patcher.h"
#include "vm/compiler/aot/aot_call_specializer.h"
#include "vm/compiler/aot/precompiler_tracer.h"
#include "vm/compiler/assembler/assembler.h"
#include "vm/compiler/assembler/disassembler.h"
#include "vm/compiler/backend/branch_optimizer.h"

@@ -50,6 +51,7 @@
#include "vm/type_table.h"
#include "vm/type_testing_stubs.h"
#include "vm/version.h"
#include "vm/zone_text_buffer.h"

namespace dart {
@@ -268,6 +270,8 @@ void Precompiler::DoCompileAll() {
    }
  }

  tracer_ = PrecompilerTracer::StartTracingIfRequested(this);

  // All stubs have already been generated, all of them share the same pool.
  // We use that pool to initialize our global object pool, to guarantee
  // stubs as well as code compiled from here on will have the same pool.

@@ -298,8 +302,11 @@ void Precompiler::DoCompileAll() {
  CollectDynamicFunctionNames();

  // Start with the allocations and invocations that happen from C++.
- AddRoots();
- AddAnnotatedRoots();
+ {
+   TracingScope scope(this);
+   AddRoots();
+   AddAnnotatedRoots();
+ }

  // With the nnbd experiment enabled, these non-nullable type arguments may
  // not be retained, although they will be used and expected to be
@@ -377,6 +384,11 @@ void Precompiler::DoCompileAll() {
    }
  }

  if (tracer_ != nullptr) {
    tracer_->Finalize();
    tracer_ = nullptr;
  }

  TraceForRetainedFunctions();
  FinalizeDispatchTable();
  ReplaceFunctionPCRelativeCallEntries();
@@ -611,38 +623,27 @@ void Precompiler::ProcessFunction(const Function& function) {
  const intptr_t gop_offset =
      FLAG_use_bare_instructions ? global_object_pool_builder()->CurrentLength()
                                 : 0;
+ RELEASE_ASSERT(!function.HasCode());

- if (!function.HasCode()) {
-   function_count_++;
+ TracingScope tracing_scope(this);
+ function_count_++;

-   if (FLAG_trace_precompiler) {
-     THR_Print("Precompiling %" Pd " %s (%s, %s)\n", function_count_,
-               function.ToLibNamePrefixedQualifiedCString(),
-               function.token_pos().ToCString(),
-               Function::KindToCString(function.kind()));
-   }
-
-   ASSERT(!function.is_abstract());
-   ASSERT(!function.IsRedirectingFactory());
-
-   error_ = CompileFunction(this, thread_, zone_, function);
-   if (!error_.IsNull()) {
-     Jump(error_);
-   }
-   // Used in the JIT to save type-feedback across compilations.
-   function.ClearICDataArray();
- } else {
-   if (FLAG_trace_precompiler) {
-     // This function was compiled from somewhere other than Precompiler,
-     // such as const constructors compiled by the parser.
-     THR_Print("Already has code: %s (%s, %s)\n",
-               function.ToLibNamePrefixedQualifiedCString(),
-               function.token_pos().ToCString(),
-               Function::KindToCString(function.kind()));
-   }
+ if (FLAG_trace_precompiler) {
+   THR_Print("Precompiling %" Pd " %s (%s, %s)\n", function_count_,
+             function.ToLibNamePrefixedQualifiedCString(),
+             function.token_pos().ToCString(),
+             Function::KindToCString(function.kind()));
+ }

- ASSERT(function.HasCode());
+ ASSERT(!function.is_abstract());
+ ASSERT(!function.IsRedirectingFactory());
+
+ error_ = CompileFunction(this, thread_, zone_, function);
+ if (!error_.IsNull()) {
+   Jump(error_);
+ }
+ // Used in the JIT to save type-feedback across compilations.
+ function.ClearICDataArray();
  AddCalleesOf(function, gop_offset);
}
@@ -676,7 +677,11 @@ void Precompiler::AddCalleesOf(const Function& function, intptr_t gop_offset) {
#endif

  String& selector = String::Handle(Z);
- if (FLAG_use_bare_instructions) {
+ // When tracing we want to scan the object pool attached to the code object
+ // rather than the global object pool, because we want to include *all*
+ // outgoing references in the trace. Scanning the GOP would exclude
+ // references that have been deduplicated.
+ if (FLAG_use_bare_instructions && !is_tracing()) {
    for (intptr_t i = gop_offset;
         i < global_object_pool_builder()->CurrentLength(); i++) {
      const auto& wrapper_entry = global_object_pool_builder()->EntryAt(i);
@@ -968,6 +973,10 @@ void Precompiler::AddClosureCall(const String& call_selector,
}

void Precompiler::AddField(const Field& field) {
+ if (is_tracing()) {
+   tracer_->WriteFieldRef(field);
+ }
+
  if (fields_to_retain_.HasKey(&field)) return;

  fields_to_retain_.Insert(&Field::ZoneHandle(Z, field.raw()));

@@ -1023,6 +1032,10 @@ bool Precompiler::MustRetainFunction(const Function& function) {
}

void Precompiler::AddFunction(const Function& function, bool retain) {
+ if (is_tracing()) {
+   tracer_->WriteFunctionRef(function);
+ }
+
  if (possibly_retained_functions_.ContainsKey(function)) return;
  if (retain || MustRetainFunction(function)) {
    possibly_retained_functions_.Insert(function);

@@ -1042,8 +1055,11 @@ bool Precompiler::IsSent(const String& selector) {
}

void Precompiler::AddSelector(const String& selector) {
- ASSERT(!selector.IsNull());
+ if (is_tracing()) {
+   tracer_->WriteSelectorRef(selector);
+ }
+
+ ASSERT(!selector.IsNull());
  if (!IsSent(selector)) {
    sent_selectors_.Insert(&String::ZoneHandle(Z, selector.raw()));
    selector_count_++;

@@ -1059,6 +1075,10 @@ void Precompiler::AddSelector(const String& selector) {
void Precompiler::AddTableSelector(const compiler::TableSelector* selector) {
  ASSERT(FLAG_use_bare_instructions && FLAG_use_table_dispatch);

+ if (is_tracing()) {
+   tracer_->WriteTableSelectorRef(selector->id);
+ }
+
  if (seen_table_selectors_.HasKey(selector->id)) return;

  seen_table_selectors_.Insert(selector->id);

@@ -1076,6 +1096,10 @@ bool Precompiler::IsHitByTableSelector(const Function& function) {
}

void Precompiler::AddInstantiatedClass(const Class& cls) {
+ if (is_tracing()) {
+   tracer_->WriteClassInstantiationRef(cls);
+ }
+
  if (cls.is_allocated()) return;

  class_count_++;

@@ -2698,6 +2722,10 @@ ErrorPtr Precompiler::CompileFunction(Precompiler* precompiler,
  ASSERT(CompilerState::Current().is_aot());
  const bool optimized = function.IsOptimizable();  // False for natives.
  DartCompilationPipeline pipeline;
+ if (precompiler->is_tracing()) {
+   precompiler->tracer_->WriteCompileFunctionEvent(function);
+ }
+
  return PrecompileFunctionHelper(precompiler, &pipeline, function, optimized);
}
@@ -25,13 +25,10 @@ class Error;
 class Field;
 class Function;
 class GrowableObjectArray;
 class SequenceNode;
 class String;
 class ParsedJSONObject;
 class ParsedJSONArray;
 class Precompiler;
 class FlowGraph;
 class PrecompilerEntryPointsPrinter;
+class PrecompilerTracer;
 
 class TableSelectorKeyValueTrait {
  public:
@@ -249,9 +246,27 @@ class Precompiler : public ValueObject {
 
   Phase phase() const { return phase_; }
 
+  bool is_tracing() const { return is_tracing_; }
+
  private:
   static Precompiler* singleton_;
 
+  // Scope which activates machine readable precompiler tracing if tracer
+  // is available.
+  class TracingScope : public ValueObject {
+   public:
+    explicit TracingScope(Precompiler* precompiler)
+        : precompiler_(precompiler), was_tracing_(precompiler->is_tracing_) {
+      precompiler->is_tracing_ = (precompiler->tracer_ != nullptr);
+    }
+
+    ~TracingScope() { precompiler_->is_tracing_ = was_tracing_; }
+
+   private:
+    Precompiler* const precompiler_;
+    const bool was_tracing_;
+  };
+
   explicit Precompiler(Thread* thread);
   ~Precompiler();
@@ -356,6 +371,8 @@ class Precompiler : public ValueObject {
   void* il_serialization_stream_;
 
   Phase phase_ = Phase::kPreparation;
+  PrecompilerTracer* tracer_ = nullptr;
+  bool is_tracing_ = false;
 };
 
 class FunctionsTraits {
168 runtime/vm/compiler/aot/precompiler_tracer.cc Normal file

@@ -0,0 +1,168 @@
// Copyright (c) 2020, the Dart project authors. Please see the AUTHORS file
// for details. All rights reserved. Use of this source code is governed by a
// BSD-style license that can be found in the LICENSE file.

#include "vm/compiler/aot/precompiler_tracer.h"

#include "vm/compiler/aot/precompiler.h"
#include "vm/zone_text_buffer.h"

namespace dart {

#if defined(DART_PRECOMPILER)

DEFINE_FLAG(charp,
            trace_precompiler_to,
            nullptr,
            "Output machine readable precompilation trace into the given file");

PrecompilerTracer* PrecompilerTracer::StartTracingIfRequested(
    Precompiler* precompiler) {
  if (FLAG_trace_precompiler_to != nullptr &&
      Dart::file_write_callback() != nullptr &&
      Dart::file_open_callback() != nullptr &&
      Dart::file_close_callback() != nullptr) {
    return new PrecompilerTracer(
        precompiler, Dart::file_open_callback()(FLAG_trace_precompiler_to,
                                                /*write=*/true));
  }
  return nullptr;
}

PrecompilerTracer::PrecompilerTracer(Precompiler* precompiler, void* stream)
    : zone_(Thread::Current()->zone()),
      precompiler_(precompiler),
      stream_(stream),
      strings_(HashTables::New<StringTable>(1024)),
      entities_(HashTables::New<EntityTable>(1024)),
      object_(Object::Handle()),
      cls_(Class::Handle()) {
  Write("{\"trace\":[\"R\",");
}

void PrecompilerTracer::Finalize() {
  Write("\"E\"],");
  WriteEntityTable();
  Write(",");
  WriteStringTable();
  Write("}\n");
  Dart::file_close_callback()(stream_);

  strings_.Release();
  entities_.Release();
}

void PrecompilerTracer::WriteEntityTable() {
  Write("\"entities\":[");
  const auto& entities_by_id =
      Array::Handle(zone_, Array::New(entities_.NumOccupied()));

  EntityTable::Iterator it(&entities_);
  while (it.MoveNext()) {
    object_ = entities_.GetPayload(it.Current(), 0);
    const intptr_t index = Smi::Cast(object_).Value();
    object_ = entities_.GetKey(it.Current());
    entities_by_id.SetAt(index, object_);
  }

  auto& obj = Object::Handle(zone_);
  auto& lib = Library::Handle(zone_);
  auto& str = String::Handle(zone_);
  for (intptr_t i = 0; i < entities_by_id.Length(); i++) {
    if (i > 0) {
      Write(",");
    }
    obj = entities_by_id.At(i);
    if (obj.IsFunction()) {
      const auto& fun = Function::Cast(obj);
      cls_ = fun.Owner();
      const intptr_t selector_id =
          FLAG_use_bare_instructions && FLAG_use_table_dispatch
              ? precompiler_->selector_map()->SelectorId(fun)
              : -1;
      Write("\"%c\",%" Pd ",%" Pd ",%" Pd "",
            fun.IsDynamicFunction() ? 'F' : 'S', InternEntity(cls_),
            InternString(NameForTrace(fun)), selector_id);
    } else if (obj.IsField()) {
      const auto& field = Field::Cast(obj);
      cls_ = field.Owner();
      str = field.name();
      Write("\"V\",%" Pd ",%" Pd ",0", InternEntity(cls_), InternString(str));
    } else if (obj.IsClass()) {
      const auto& cls = Class::Cast(obj);
      lib = cls.library();
      str = lib.url();
      const auto url_id = InternString(str);
      str = cls.ScrubbedName();
      const auto name_id = InternString(str);
      Write("\"C\",%" Pd ",%" Pd ",0", url_id, name_id);
    } else {
      UNREACHABLE();
    }
  }
  Write("]");
}

void PrecompilerTracer::WriteStringTable() {
  Write("\"strings\":[");
  GrowableArray<const char*> strings_by_id(strings_.NumOccupied());
  strings_by_id.EnsureLength(strings_.NumOccupied(), nullptr);
  StringTable::Iterator it(&strings_);
  while (it.MoveNext()) {
    object_ = strings_.GetPayload(it.Current(), 0);
    const auto index = Smi::Cast(object_).Value();
    object_ = strings_.GetKey(it.Current());
    strings_by_id[index] = String::Cast(object_).ToCString();
  }
  auto comma = false;
  for (auto str : strings_by_id) {
    Write("%s\"%s\"", comma ? "," : "", str);
    comma = true;
  }
  Write("]");
}

intptr_t PrecompilerTracer::InternString(const CString& cstr) {
  object_ = Smi::New(strings_.NumOccupied());
  object_ = strings_.InsertNewOrGetValue(cstr, object_);
  return Smi::Cast(object_).Value();
}

intptr_t PrecompilerTracer::InternString(const String& str) {
  object_ = Smi::New(strings_.NumOccupied());
  object_ = strings_.InsertOrGetValue(str, object_);
  return Smi::Cast(object_).Value();
}

intptr_t PrecompilerTracer::InternEntity(const Object& obj) {
  ASSERT(obj.IsFunction() || obj.IsClass() || obj.IsField());
  const auto num_occupied = entities_.NumOccupied();
  object_ = Smi::New(num_occupied);
  object_ = entities_.InsertOrGetValue(obj, object_);
  const auto id = Smi::Cast(object_).Value();
  if (id == num_occupied) {
    cls_ = Class::null();
    if (obj.IsFunction()) {
      cls_ = Function::Cast(obj).Owner();
    } else if (obj.IsField()) {
      cls_ = Field::Cast(obj).Owner();
    }
    if (cls_.raw() != Class::null()) {
      InternEntity(cls_);
    }
  }
  return id;
}

PrecompilerTracer::CString PrecompilerTracer::NameForTrace(const Function& f) {
  ZoneTextBuffer buffer(zone_);
  f.PrintName(NameFormattingParams::DisambiguatedWithoutClassName(
                  Object::NameVisibility::kInternalName),
              &buffer);
  return {buffer.buffer(), buffer.length(),
          String::Hash(buffer.buffer(), buffer.length())};
}

#endif  // defined(DART_PRECOMPILER)

}  // namespace dart
145 runtime/vm/compiler/aot/precompiler_tracer.h Normal file

@@ -0,0 +1,145 @@
// Copyright (c) 2020, the Dart project authors. Please see the AUTHORS file
// for details. All rights reserved. Use of this source code is governed by a
// BSD-style license that can be found in the LICENSE file.

#ifndef RUNTIME_VM_COMPILER_AOT_PRECOMPILER_TRACER_H_
#define RUNTIME_VM_COMPILER_AOT_PRECOMPILER_TRACER_H_

#if defined(DART_PRECOMPILED_RUNTIME)
#error "AOT runtime should not use compiler sources (including header files)"
#endif  // defined(DART_PRECOMPILED_RUNTIME)

#include "vm/allocation.h"
#include "vm/hash_table.h"
#include "vm/symbols.h"

namespace dart {

// Forward declarations.
class Precompiler;

#if defined(DART_PRECOMPILER)
// Tracer which produces a machine readable precompiler trace, capturing
// information about all compiled functions and dependencies between them.
// See pkg/vm_snapshot_analysis/README.md for the definition of the format.
class PrecompilerTracer : public ZoneAllocated {
 public:
  static PrecompilerTracer* StartTracingIfRequested(Precompiler* precompiler);

  void Finalize();

  void WriteEntityRef(const Object& field) {
    Write("%" Pd ",", InternEntity(field));
  }

  void WriteFieldRef(const Field& field) { WriteEntityRef(field); }

  void WriteFunctionRef(const Function& function) { WriteEntityRef(function); }

  void WriteSelectorRef(const String& selector) {
    Write("\"S\",%" Pd ",", InternString(selector));
  }

  void WriteTableSelectorRef(intptr_t id) { Write("\"T\",%" Pd ",", id); }

  void WriteClassInstantiationRef(const Class& cls) { WriteEntityRef(cls); }

  void WriteCompileFunctionEvent(const Function& function) {
    Write("\"C\",");
    WriteEntityRef(function);
  }

 private:
  struct CString {
    const char* str;
    const intptr_t length;
    intptr_t hash;
  };

  struct StringTableTraits {
    static bool ReportStats() { return false; }
    static const char* Name() { return "StringTableTraits"; }

    static bool IsMatch(const Object& a, const Object& b) {
      return String::Cast(a).Equals(String::Cast(b));
    }

    static bool IsMatch(const CString& cstr, const Object& other) {
      const String& other_str = String::Cast(other);
      if (other_str.Hash() != cstr.hash) {
        return false;
      }

      if (other_str.Length() != cstr.length) {
        return false;
      }

      return other_str.Equals(cstr.str);
    }

    static uword Hash(const CString& cstr) { return cstr.hash; }

    static uword Hash(const Object& obj) { return String::Cast(obj).Hash(); }

    static ObjectPtr NewKey(const CString& cstr) {
      return Symbols::New(Thread::Current(), cstr.str);
    }
  };

  struct EntityTableTraits {
    static bool ReportStats() { return false; }
    static const char* Name() { return "EntityTableTraits"; }

    static bool IsMatch(const Object& a, const Object& b) {
      return a.raw() == b.raw();
    }

    static uword Hash(const Object& obj) {
      if (obj.IsFunction()) {
        return Function::Cast(obj).Hash();
      } else if (obj.IsClass()) {
        return String::HashRawSymbol(Class::Cast(obj).Name());
      } else if (obj.IsField()) {
        return String::HashRawSymbol(Field::Cast(obj).name());
      }
      return obj.GetClassId();
    }
  };

  using StringTable = UnorderedHashMap<StringTableTraits>;
  using EntityTable = UnorderedHashMap<EntityTableTraits>;

  PrecompilerTracer(Precompiler* precompiler, void* stream);

  intptr_t InternString(const CString& cstr);
  intptr_t InternString(const String& str);
  intptr_t InternEntity(const Object& obj);

  void Write(const char* format, ...) PRINTF_ATTRIBUTE(2, 3) {
    va_list va;
    va_start(va, format);
    const char* line = OS::VSCreate(zone_, format, va);
    Dart::file_write_callback()(line, strlen(line), stream_);
    va_end(va);
  }

  CString NameForTrace(const Function& f);

  void WriteEntityTable();
  void WriteStringTable();

  Zone* zone_;
  Precompiler* precompiler_;
  void* stream_;
  StringTable strings_;
  EntityTable entities_;

  Object& object_;
  Class& cls_;
};
#endif  // defined(DART_PRECOMPILER)

}  // namespace dart

#endif  // RUNTIME_VM_COMPILER_AOT_PRECOMPILER_TRACER_H_
@@ -11,6 +11,8 @@ compiler_sources = [
   "aot/dispatch_table_generator.h",
   "aot/precompiler.cc",
   "aot/precompiler.h",
+  "aot/precompiler_tracer.cc",
+  "aot/precompiler_tracer.h",
   "asm_intrinsifier.cc",
   "asm_intrinsifier.h",
   "asm_intrinsifier_arm.cc",
@@ -15964,6 +15964,7 @@ CodePtr Code::FinalizeCodeAndNotify(const char* name,
 
 #if defined(DART_PRECOMPILER)
 DECLARE_FLAG(charp, write_v8_snapshot_profile_to);
+DECLARE_FLAG(charp, trace_precompiler_to);
 #endif  // defined(DART_PRECOMPILER)
 
 CodePtr Code::FinalizeCode(FlowGraphCompiler* compiler,

@@ -15985,12 +15986,13 @@ CodePtr Code::FinalizeCode(FlowGraphCompiler* compiler,
     }
   } else {
 #if defined(DART_PRECOMPILER)
-    if (FLAG_write_v8_snapshot_profile_to != nullptr &&
-        assembler->HasObjectPoolBuilder() &&
+    const bool needs_pool = (FLAG_write_v8_snapshot_profile_to != nullptr) ||
+                            (FLAG_trace_precompiler_to != nullptr);
+    if (needs_pool && assembler->HasObjectPoolBuilder() &&
         assembler->object_pool_builder().HasParent()) {
       // We are not going to write this pool into snapshot, but we will use
-      // it to emit references from code object to other objects in the
-      // snapshot that it caused to be added to the pool.
+      // it to emit references from this code object to other objects in the
+      // snapshot that it uses.
       object_pool =
           ObjectPool::NewFromBuilder(assembler->object_pool_builder());
     }
@@ -414,6 +414,7 @@ void ProgramVisitor::BindStaticCalls(Zone* zone, Isolate* isolate) {
   WalkProgram(zone, isolate, &visitor);
 }
 
+DECLARE_FLAG(charp, trace_precompiler_to);
 DECLARE_FLAG(charp, write_v8_snapshot_profile_to);
 
 void ProgramVisitor::ShareMegamorphicBuckets(Zone* zone, Isolate* isolate) {

@@ -913,7 +914,8 @@ void ProgramVisitor::DedupUnlinkedCalls(Zone* zone, Isolate* isolate) {
   // implicit and go through global object pool). This information is needed
   // to produce more informative snapshot profile.
   if (!FLAG_use_bare_instructions ||
-      FLAG_write_v8_snapshot_profile_to != nullptr) {
+      FLAG_write_v8_snapshot_profile_to != nullptr ||
+      FLAG_trace_precompiler_to != nullptr) {
    WalkProgram(zone, isolate, &deduper);
  }
 }