Commit 42da84f

Authored and committed by spupyrev
[BOLT] Always match stale entry blocks
Two (minor) improvements for stale matching:
- always match entry blocks to each other, even if there is a hash mismatch;
- ignore nops in (loose) hash computation.

I record a small improvement in inference quality on my benchmarks. Tests are not affected.

Reviewed By: Amir

Differential Revision: https://reviews.llvm.org/D159488
Parent: adfd4a2 · Commit: 42da84f

File tree

2 files changed: 5 additions (+), 1 deletion (−)


bolt/lib/Core/HashUtilities.cpp

Lines changed: 2 additions & 1 deletion
@@ -139,7 +139,8 @@ std::string hashBlockLoose(BinaryContext &BC, const BinaryBasicBlock &BB) {
   // instruction opcodes, which is then hashed with std::hash.
   std::set<std::string> Opcodes;
   for (const MCInst &Inst : BB) {
-    if (BC.MIB->isPseudo(Inst))
+    // Skip pseudo instructions and nops.
+    if (BC.MIB->isPseudo(Inst) || BC.MIB->isNoop(Inst))
       continue;
 
     // Ignore unconditional jumps, as they can be added / removed as a result
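As a concrete illustration of the HashUtilities.cpp change, below is a minimal, self-contained sketch of the loose block-hash idea: collect the set of opcode names in a block, skipping pseudo instructions and nops, and hash the result. The Instruction struct and its IsPseudo/IsNoop flags are hypothetical stand-ins for BOLT's MCInst and MCPlusBuilder queries, not the actual BOLT API.

#include <cstddef>
#include <functional>
#include <iostream>
#include <set>
#include <string>
#include <vector>

// Hypothetical instruction record; BOLT uses MCInst plus MCPlusBuilder queries.
struct Instruction {
  std::string Opcode;
  bool IsPseudo = false;
  bool IsNoop = false;
};

// Loose hash: only the set of opcode names matters, not order or count.
size_t hashBlockLoose(const std::vector<Instruction> &Block) {
  std::set<std::string> Opcodes;
  for (const Instruction &Inst : Block) {
    // Skip pseudo instructions and nops, mirroring the change above.
    if (Inst.IsPseudo || Inst.IsNoop)
      continue;
    Opcodes.insert(Inst.Opcode);
  }
  std::string Concat;
  for (const std::string &Op : Opcodes)
    Concat += Op + ";";
  return std::hash<std::string>{}(Concat);
}

int main() {
  std::vector<Instruction> A = {{"mov"}, {"add"}, {"nop", false, true}};
  std::vector<Instruction> B = {{"add"}, {"mov"}};
  // With nops ignored, both blocks hash identically: prints 1.
  std::cout << (hashBlockLoose(A) == hashBlockLoose(B)) << "\n";
}

With nops excluded from the opcode set, two blocks that differ only by padding nops produce the same loose hash, which is presumably why ignoring them helps stale blocks keep matching.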

bolt/lib/Profile/StaleProfileMatching.cpp

Lines changed: 3 additions & 0 deletions
@@ -389,6 +389,9 @@ void matchWeightsByHashes(BinaryContext &BC,
     assert(YamlBB.Hash != 0 && "empty hash of BinaryBasicBlockProfile");
     BlendedBlockHash YamlHash(YamlBB.Hash);
     const FlowBlock *MatchedBlock = Matcher.matchBlock(YamlHash);
+    // Always match the entry block.
+    if (MatchedBlock == nullptr && YamlBB.Index == 0)
+      MatchedBlock = Blocks[0];
     if (MatchedBlock != nullptr) {
       MatchedBlocks[YamlBB.Index] = MatchedBlock;
       BlendedBlockHash BinHash = BlendedHashes[MatchedBlock->Index - 1];
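Similarly, here is a simplified sketch of the entry-block fallback in StaleProfileMatching.cpp: profile blocks are matched to binary flow blocks by hash, but the entry block (index 0) is paired with the binary entry block even when its hash no longer matches. FlowBlock, ProfileBlock, and matchByHash are stand-in types and helpers, not BOLT's actual Matcher.

#include <cstdint>
#include <iostream>
#include <unordered_map>
#include <vector>

// Stand-in for a block of the binary's control-flow graph.
struct FlowBlock {
  uint64_t Hash;
};

// Stand-in for a basic-block profile entry read from the (possibly stale) profile.
struct ProfileBlock {
  size_t Index;
  uint64_t Hash;
};

// Naive hash-based matcher; BOLT's real matcher is more involved.
const FlowBlock *matchByHash(const std::vector<FlowBlock> &Blocks,
                             uint64_t Hash) {
  for (const FlowBlock &B : Blocks)
    if (B.Hash == Hash)
      return &B;
  return nullptr;
}

std::unordered_map<size_t, const FlowBlock *>
matchBlocks(const std::vector<FlowBlock> &Blocks,
            const std::vector<ProfileBlock> &Profile) {
  std::unordered_map<size_t, const FlowBlock *> Matched;
  for (const ProfileBlock &PB : Profile) {
    const FlowBlock *M = matchByHash(Blocks, PB.Hash);
    // Always match the entry block, even on a hash mismatch.
    if (M == nullptr && PB.Index == 0)
      M = &Blocks[0];
    if (M != nullptr)
      Matched[PB.Index] = M;
  }
  return Matched;
}

int main() {
  std::vector<FlowBlock> Blocks = {{0x1111}, {0x2222}};
  std::vector<ProfileBlock> Profile = {{0, 0x9999}, {1, 0x2222}};
  auto Matched = matchBlocks(Blocks, Profile);
  // The entry block is matched despite its stale hash 0x9999: prints "2 blocks matched".
  std::cout << Matched.size() << " blocks matched\n";
}

In the sketch, the profile's entry block carries a stale hash that matches nothing, yet it is still paired with Blocks[0], so its weight can still feed the inference step, which is consistent with the commit's reported improvement in inference quality.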

0 commit comments
