linux/arch/powerpc/lib
Paul Mackerras a7c81ce398 powerpc/64: Make exception table clearer in __copy_tofrom_user_base
This aims to make the generation of exception table entries for the
loads and stores in __copy_tofrom_user_base clearer and easier to
verify.  Instead of having a series of local labels on the loads and
stores, with a series of corresponding labels later for the exception
handlers, we now use macros to generate exception table entries at the
point of each load and store that could potentially trap.  We do this
with the macros lex (load exception) and stex (store exception).
These macros are used right before the load or store to which they
apply.
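
Concretely, each macro emits an exception table entry, via the
kernel's EX_TABLE helper, pointing at a common handler (the
.Lld_exc - r3_offset target is explained below):

	.macro lex		/* exception table entry for a load */
100:	EX_TABLE(100b, .Lld_exc - r3_offset)
	.endm

	.macro stex		/* exception table entry for a store */
100:	EX_TABLE(100b, .Lst_exc - r3_offset)
	.endm

A macro invocation immediately precedes the access that may fault,
for example (register choices here are illustrative):

lex;	ld	r0,0(r4)
stex;	std	r0,0(r3)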

Some complexity is introduced by the fact that we have some more work
to do after hitting an exception, because we need to calculate and
return the number of bytes not copied.  The code uses r3 as the
current pointer into the destination buffer, that is, the address of
the first byte of the destination that has not been modified.
However, at various points in the copy loops, r3 can be 4, 8, 16 or 24
bytes behind that point.

To express this offset in an understandable way, we define a symbol
r3_offset which is updated at various points so that it is equal to the
difference between the address of the first unmodified byte of the
destination and the value in r3.  (In fact it only needs to be
accurate at the point of each lex or stex macro invocation.)

The rules for updating r3_offset are as follows:

* It starts out at 0
* An addi r3,r3,N instruction decreases r3_offset by N
* A store instruction (stb, sth, stw, std) to N(r3)
  increases r3_offset by the width of the store (1, 2, 4, 8)
* A store with update instruction (stbu, sthu, stwu, stdu) to N(r3)
  sets r3_offset to the width of the store.
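
A short worked example (r3_offset is tracked in the comments;
register choices are illustrative):

	/* r3_offset = 0: r3 points at the first unmodified byte */
stex;	std	r0,0(r3)	/* r3_offset becomes 8 */
stex;	std	r6,8(r3)	/* r3_offset becomes 16 */
	addi	r3,r3,16	/* r3_offset drops back to 0 */
stex;	stdu	r7,8(r3)	/* store with update: r3_offset = 8 */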

There is some trickiness to the way that the lex and stex macros and
the associated exception handlers work.  I would have liked to use
the current value of r3_offset in the name of the symbol used as
the exception handler, as in ".Lld_exc_$(r3_offset)" and then
have symbols .Lld_exc_0, .Lld_exc_8, .Lld_exc_16 etc. corresponding
to the offsets that needed to be added to r3.  However, I couldn't
see a way to do that with gas.

Instead, the exception handler address is .Lld_exc - r3_offset or
.Lst_exc - r3_offset, that is, the distance ahead of .Lld_exc/.Lst_exc
that we start executing is equal to the amount that we need to add to
r3.  This works because r3_offset is always a small multiple of 4,
and our instructions are 4 bytes long.  This means that before
.Lld_exc and .Lst_exc, we have a sequence of instructions that
increments r3 by 4, 8, 16 or 24 depending on where we start.  The
sequence increments r3 by 4 per instruction (on average).
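
One layout that satisfies these constraints (illustrative only; the
exact instruction choices are in the patch itself):

	addi	r3,r3,8		/* entry point when r3_offset == 24 */
	nop
	addi	r3,r3,8		/* entry point when r3_offset == 16 */
	nop
	addi	r3,r3,4		/* entry point when r3_offset == 8 */
	addi	r3,r3,4		/* entry point when r3_offset == 4 */
.Lld_exc:
	/* r3 now points at the first unmodified destination byte */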

We also replace the exception table for the 4k copy loop by a
macro per load or store.  These loads and stores all use exactly
the same exception handler, which simply resets the argument registers
r3, r4 and r5 to their original values and re-does the whole copy
using the slower loop.
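
Schematically, and with invented label and register names (the
original argument values are assumed to have been stashed in
non-volatile registers on entry):

	.macro exc4k		/* hypothetical name for the 4k-loop variant */
100:	EX_TABLE(100b, .Labort_4k)
	.endm

.Labort_4k:
	mr	r3,r29		/* original destination */
	mr	r4,r30		/* original source */
	mr	r5,r31		/* original length */
	b	.Lslow_copy	/* redo everything with the byte-accurate loop */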

Signed-off-by: Paul Mackerras <paulus@ozlabs.org>
Signed-off-by: Michael Ellerman <mpe@ellerman.id.au>
2018-08-08 00:32:34 +10:00
alloc.c License cleanup: add SPDX GPL-2.0 license identifier to files with no license 2017-11-02 11:10:55 +01:00
checksum_32.S powerpc: Implement csum_ipv6_magic in assembly 2018-06-04 00:39:19 +10:00
checksum_64.S powerpc: Implement csum_ipv6_magic in assembly 2018-06-04 00:39:19 +10:00
checksum_wrappers.c Replace <asm/uaccess.h> with <linux/uaccess.h> globally 2016-12-24 11:46:01 -08:00
code-patching.c powerpc/asm: Add a patch_site macro & helpers for patching instructions 2018-08-08 00:32:25 +10:00
copy_32.S powerpc/32: remove a NOP from memset() 2017-09-01 16:42:46 +10:00
copypage_64.S powerpc: clean inclusions of asm/feature-fixups.h 2018-07-30 22:48:17 +10:00
copypage_power7.S powerpc/64: enhance memcmp() with VMX instruction for long bytes comparision 2018-07-24 22:03:21 +10:00
copyuser_64.S powerpc/64: Make exception table clearer in __copy_tofrom_user_base 2018-08-08 00:32:34 +10:00
copyuser_power7.S powerpc/64s: Set assembler machine type to POWER4 2018-04-01 00:47:49 +11:00
crtsavres.S powerpc/64: Do not create new section for save/restore functions 2017-05-30 14:59:51 +10:00
div64.S
feature-fixups-test.S powerpc: move ASM_CONST and stringify_in_c() into asm-const.h 2018-07-30 22:48:16 +10:00
feature-fixups.c powerpc/fsl: Add barrier_nospec implementation for NXP PowerPC Book3E 2018-08-08 00:32:24 +10:00
hweight_64.S powerpc: clean inclusions of asm/feature-fixups.h 2018-07-30 22:48:17 +10:00
ldstfp.S powerpc: move ASM_CONST and stringify_in_c() into asm-const.h 2018-07-30 22:48:16 +10:00
locks.c powerpc: clean the inclusion of stringify.h 2018-07-30 22:48:17 +10:00
Makefile powerpc/lib: Implement strlen() in assembly for PPC32 2018-08-07 21:49:30 +10:00
mem_64.S powerpc/string: Implement optimized memset variants 2017-08-17 23:04:35 +10:00
memcmp_32.S powerpc/lib: optimise PPC32 memcmp 2018-06-04 00:39:21 +10:00
memcmp_64.S powerpc/64: add 32 bytes prechecking before using VMX optimization on memcmp() 2018-07-24 22:03:21 +10:00
memcpy_64.S powerpc: clean inclusions of asm/feature-fixups.h 2018-07-30 22:48:17 +10:00
memcpy_power7.S powerpc/64: enhance memcmp() with VMX instruction for long bytes comparision 2018-07-24 22:03:21 +10:00
pmem.c powerpc/lib: Implement UACCESS_FLUSHCACHE API 2017-11-13 08:00:31 +11:00
quad.S powerpc: Handle most loads and stores in instruction emulation code 2017-09-01 16:39:48 +10:00
rheap.c treewide: kmalloc() -> kmalloc_array() 2018-06-12 16:19:22 -07:00
sstep.c powerpc/sstep: Fix kernel crash if VSX is not present 2018-06-04 00:39:08 +10:00
string.S powerpc/lib: optimise PPC32 memcmp 2018-06-04 00:39:21 +10:00
string_32.S powerpc/lib: optimise 32 bits __clear_user() 2018-06-04 00:39:21 +10:00
string_64.S powerpc: Fix invalid use of register expressions 2017-08-10 22:29:41 +10:00
strlen_32.S powerpc/lib: Implement strlen() in assembly for PPC32 2018-08-07 21:49:30 +10:00
test_emulate_step.c powerpc/sstep: Fix emulate_step test if VSX not present 2018-06-04 00:39:14 +10:00
vmx-helper.c powerpc/64: enhance memcmp() with VMX instruction for long bytes comparision 2018-07-24 22:03:21 +10:00
xor_vmx.c powerpc/lib/xor_vmx: Ensure no altivec code executes before enable_kernel_altivec() 2017-06-02 20:17:52 +10:00
xor_vmx.h License cleanup: add SPDX GPL-2.0 license identifier to files with no license 2017-11-02 11:10:55 +01:00
xor_vmx_glue.c powerpc/altivec: Add missing prototypes for altivec 2018-05-25 12:04:38 +10:00