HTTP/2 (HPACK): how to derive the dynamic decompression table index from the payload

"Request Message 1" is using static table index 31 to send content-type information. Then the entry is added to dynamic table with index value 63. How to derive the dynamic table index value from "Request Message 1"?
Request message 1:
Header: content-type: multipart/related; boundary=++Boundary
Name Length: 12
Name: content-type
Value Length: 38
Value: multipart/related; boundary=++Boundary
content-type: multipart/related; boundary=++Boundary
[Unescaped: multipart/related; boundary=++Boundary]
Representation: Literal Header Field with Incremental Indexing - Indexed Name
Index: 31
Hex dump
5f 9d a6 da 12 6a c7 62 58 b0 b4 0d 25 93 ed 48
cf 6d 52 0e cf 50 7f bf f7 74 f6 d5 20 ec f5
Request message 2:
Header: content-type: multipart/related; boundary=++Boundary
Name Length: 12
Name: content-type
Value Length: 38
Value: multipart/related; boundary=++Boundary
content-type: multipart/related; boundary=++Boundary
[Unescaped: multipart/related; boundary=++Boundary]
Representation: Indexed Header Field
Index: 63
Hex dump : 0xbf (dynamic table index value)

If I understand you right: your first request is marked as "Literal Header Field with Incremental Indexing". That means the header name was found in the static or dynamic table (here at index 31), and the header field must be added to the dynamic table (because it has a new value). The dynamic table's first index is 62, because the static table ends at 61. When a header is added to the dynamic table, it goes to the top, index 62 (RFC 7541, 2.3). I assume you did not show us the whole request; most likely it had another incrementally indexed header that took a position above this one.

MadBard is right. The hex dump of the header shows the first octet is 5f = 01011111.
According to RFC 7541, section 6.2.1 ("Literal Header Field with Incremental Indexing"), the first two bits, 01, indicate a header field that should be added to the dynamic table. Since the next 6 bits (011111) are not all zero, they reference an existing header name by its index. 011111 is 31, so the header name at index 31 of the static table is used, which is "content-type" (see Appendix A, "Static Table Definition", of RFC 7541). The new header field is thus composed of the name taken from the static table (content-type) and the value carried over the wire in Request 1. (The value is also Huffman encoded to save a few bytes, which is why we can't read the ASCII directly from the hex dump.) This new entry is then inserted at index 62 of the table, and the indices of all existing dynamic entries are incremented by one (e.g. the previous 62 becomes 63), because the dynamic table behaves as a FIFO queue.
Another header was added to the dynamic table after the one of interest, which is why the index seen in Request 2 is 63 rather than 62: the entry was bumped up by one after it was inserted. If you kept monitoring as more headers were added, you would see the index of this particular entry keep incrementing, until it is eventually evicted once the dynamic table size is exhausted.
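To make the index handling concrete, here is a minimal Go sketch of the prefix-integer decoding from RFC 7541, section 5.1, applied to the two leading octets above. It is only an illustration of the indexing rules, not a full HPACK decoder.
package main

import "fmt"

// decodePrefixInt reads an HPACK prefix-encoded integer (RFC 7541, 5.1) from b,
// where the integer occupies the low `prefix` bits of the first octet.
// Continuation octets are consumed while their high bit is set.
func decodePrefixInt(b []byte, prefix uint) (value int, consumed int) {
	mask := (1 << prefix) - 1
	value = int(b[0]) & mask
	consumed = 1
	if value < mask { // the value fits entirely in the prefix bits
		return value, consumed
	}
	shift := 0
	for {
		octet := b[consumed]
		consumed++
		value += int(octet&0x7f) << shift
		shift += 7
		if octet&0x80 == 0 {
			return value, consumed
		}
	}
}

func main() {
	// Request 1: first octet 0x5f = 0101_1111.
	// Pattern 01xxxxxx => Literal Header Field with Incremental Indexing, 6-bit name index.
	idx1, _ := decodePrefixInt([]byte{0x5f}, 6)
	fmt.Println("request 1 name index:", idx1) // 31 -> "content-type" in the static table

	// Request 2: single octet 0xbf = 1011_1111.
	// Pattern 1xxxxxxx => Indexed Header Field, 7-bit index into the full table.
	idx2, _ := decodePrefixInt([]byte{0xbf}, 7)
	fmt.Println("request 2 index:", idx2) // 63 -> second entry of the dynamic table
}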

Related

Why is the BER-TLV "DF9A" tag recognized as "invalid"?

I have a problem understanding why all of the BER-TLV parsers I found:
https://paymentcardtools.com/emv-tlv-parser
https://emvlab.org/tlvutils/
https://chrome.google.com/webstore/detail/tlv-parser/iemijfhdlipdpcjfnphcdalpccnkfedb
recognize this tag: DF9A03001736 as "invalid", while DF5603001736 and DF0903001736 work just fine.
What's the difference?
Just follow the description provided in EMV Book 3, Annex B1.
"invalid" case: DF9A03001736
DF - 1101 1111
9A - 1001 1010 - here, in the second byte of the Tag, the b8 is set (1), which means that 'Another byte follows', so the following byte (value 03) is also part of the Tag
03 - 0000 0011 - the last byte of the Tag, i.e. the actual Tag is DF9A03
so, in our sequence we have:
DF9A03 - Tag
00 - Length (no value)
17 - is already a new Tag
36 - length of the Tag 17 ...
So the parser (https://paymentcardtools.com/emv-tlv-parser) fails because no data is available (Error during parsing Tag 17: Not enough data)
correct example: DF5603001736
DF - 1101 1111
56 - 0101 0110 - there are no more bytes that constitute Tag, so we just have Tag DF56
the sequence is:
DF56 - Tag
03 - Length
001736 - Value
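If it helps to see that rule in code, here is a small Go sketch (independent of the parsers linked above) that extracts just the tag bytes and shows why the 03 is swallowed into the DF9A03 tag:
package main

import (
	"encoding/hex"
	"fmt"
)

// readTag returns the BER-TLV tag starting at data[0] and the number of bytes
// it occupies, following the tag rules in EMV Book 3, Annex B1.
func readTag(data []byte) (tag []byte, n int) {
	n = 1
	if data[0]&0x1F == 0x1F { // b5..b1 of the first byte all set: more tag bytes follow
		for {
			n++
			if data[n-1]&0x80 == 0 { // b8 clear: this is the last byte of the tag
				break
			}
		}
	}
	return data[:n], n
}

func main() {
	for _, s := range []string{"DF9A03001736", "DF5603001736"} {
		raw, _ := hex.DecodeString(s)
		tag, n := readTag(raw)
		fmt.Printf("%s -> tag %X, length byte %02X, remaining %X\n", s, tag, raw[n], raw[n+1:])
	}
}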

Error: syntax error, unexpected "Identifier", expecting EXTERNAL or GLOBAL

Hi, I was wondering if y'all could help me figure this error out. I'm rather new to COBOL, as this is my first (and only) COBOL class in my major.
I keep getting this error: lab3a.cob:23: Error: syntax error, unexpected "Identifier", expecting EXTERNAL or GLOBAL
whenever I try to compile, and I can't seem to see what I'm doing wrong.
My Code
IDENTIFICATION DIVISION.
PROGRAM-ID. "LAB3A".
Author. Fielding Featherston
* Takes inputs from file and seperates.
ENVIRONMENT DIVISION.
INPUT-OUTPUT SECTION.
FILE-CONTROL.
SELECT InFile
ASSIGN to "lab3-in.dat"
ORGANIZATION is LINE SEQUENTIAL.
DATA DIVISION.
FILE SECTION.
FD InFile.
01 InString.
05 PIC X(13).
05 Instrument PIC X(12).
88 Brass value "Bugle" "Flugelhorn"
"Sousaphone" "Trombone"
"Trumpet" "Tuba".
88 Percussion value "Bass Drum" "Bells" "Bongos"
"Castanets" "Chimes" "Cymbals"
"Snare Drum" "Xylophone".
88 Strings value "Banjo" "Bass" "Cello" "Guitar"
"Harp" "Lyre"
"Mandolin" "Violin".
88 Woodwind value "Bagpipes" "Bassoon" "Clarinet"
"Flute" "Oboe"
"Piccolo" "Saxophone".
WORKING-STORAGE SECTION.
01 BrassCount PIC 9(3).
01 PerCount PIC 9(3).
01 StringCount PIC 9(3).
01 WoodCount PIC 9(3).
01 OtherCount PIC 9(3).
01 BrassStr PIC ZZ9.
01 PerStr PIC ZZ9.
01 StringStr PIC ZZ9.
01 WoodStr PIC ZZ9.
01 OtherStr PIC ZZ9.
01 InStringLength PIC 99.
01 EndFileStr PIC X VALUE "n".
88 EndFile VALUE "y"
When Set to False is "y".
PROCEDURE DIVISION.
000-Main.
Open Input InFile
Perform until EndFile
Read InFile
At end
Set EndFile to FALSE
Not at End
PERFORM 100-SeperateStrings
PERFORM 200-ClassCount
END-READ
END-PERFORM
CLOSE InFile
Move BrassCount to BrassStr
Move PerCount to PerStr
Move StringCount to StringStr
Move WoodCount to WoodStr
Move OtherCount to OtherStr
DISPLAY "Counts"
DISPLAY " Brass: " FUNCTION TRIM(BrassStr)
DISPLAY " Percussion: " FUNCTION TRIM(PerStr)
DISPLAY " String: " FUNCTION TRIM(StringStr)
DISPLAY " Woodwind: " FUNCTION TRIM(WoodStr)
DISPLAY " OTHER: " FUNCTION TRIM(OtherStr)
STOP RUN.
100-SeperateStrings.
MOVE FUNCTION Length(InString) to InStringLength
UNSTRING InString (14:InStringLength)
INTO Instrument
END-UNSTRING.
200-ClassCount.
IF Brass
Add 1 to BrassCount
ELSE IF Percussion
Add 1 to PerCount
ELSE IF Strings
Add 1 to StringCount
ELSE IF Woodwind
Add 1 to WoodCount
ELSE
Add 1 to OtherCount
END-IF.
An EXTERNAL or GLOBAL clause in the context of the error may only occur in a record description entry; that is, a data entry that begins with 1 or 01. Given that the error occurs between two 88 level items, it appears the compiler is confused about where it is while scanning the source code.
There is some unusual formatting that may be creating a problem for the compiler. In particular, line 22 contains a number of TAB characters that should not, but may, confuse the compiler. Also, lines 33 and 46 contain a number of TAB characters at the end of each source line, causing those lines to exceed 72 characters.
Another possible issue is the expansion of tabs: whether the compiler replaces each TAB character with 4 or 8 spaces. Again, this affects whether the text exceeds 72 characters. In the absence of a SOURCE FORMAT directive, source text after column 72 is ignored.
Until you know the effect that tabs have on the source code, I suggest replacing all tabs with spaces.
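One way to act on that advice is sketched below in Go: a plain tab-to-spaces substitution (not column-aware tab expansion); the file names and the 8-space width are assumptions, so adjust them to your environment.
package main

import (
	"bytes"
	"os"
)

func main() {
	// Replace every TAB in the COBOL source with spaces so column positions
	// (Area A/B, the 72-character limit) are unambiguous.
	src, err := os.ReadFile("lab3a.cob")
	if err != nil {
		panic(err)
	}
	out := bytes.ReplaceAll(src, []byte("\t"), []byte("        ")) // 8 spaces per tab (assumed width)
	if err := os.WriteFile("lab3a-notabs.cob", out, 0644); err != nil {
		panic(err)
	}
}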

Use null.Time values in a golang template

I'm using gopkg.in/guregu/null.v4 to get some data from a Postgres DB, and the results are coming back fine; I can put them into JSON format and the world is happy. However, I'm trying to email the results using a template and have hit a problem.
The structure is (partially)
type DataQuery struct {
Date null.Time `json:"DateTime"`
....
The template is
{{define "plainBody"}}
Hi,
Here are the results for the check run for today.
The number of rows returned is {{.Rows}}
The data is
{{ range .Data}}
{{.Date}}
{{end}}
{{end}}
And the results of running that template are
Hi,
Here are the results for the check run for today.
The number of rows returned is 57
The data is
{{2021-09-13 00:00:00 &#43;0000 &#43;0000 true}}
{{2021-08-16 00:00:00 &#43;0000 &#43;0000 true}}
{{2021-09-19 00:00:00 &#43;0000 &#43;0000 true}}
{{2021-09-18 00:00:00 &#43;0000 &#43;0000 true}}
I tried using {{.Date.EncodeText}} and ended up with
[50 48 50 49 45 48 57 45 49 51 84 48 48 58 48 48 58 48 48 90]
[50 48 50 49 45 48 56 45 49 54 84 48 48 58 48 48 58 48 48 90]
[50 48 50 49 45 48 57 45 49 57 84 48 48 58 48 48 58 48 48 90]
for the datetime fields (which might be a []byte of the strings, but I'm not sure).
If I use {{.Date.Value}} I get
2021-09-13 00:00:00 +0000 +0000
The other field types (string, int, float) all work fine with
{{.Variable.ValueOrZero}}
I think I'm close, but I can't quite crack it for the datetime fields.
First, you are using html/template, which provides context-sensitive escaping; that's why you're seeing those &#43; sequences. If you want text output, use text/template instead. For details, see Template unnecessarily escaping `<` to `&lt;` but not `>`.
Next, null.Time is not just a simple time.Time value; it wraps other fields too (whether the time is valid). When you simply output it, that Valid field is also rendered (the true texts in your output).
You may render only its Time field: {{.Date.Time}}.
With these changes the output will be, for example:
Hi,
Here are the results for the check run for today.
The number of rows returned is 2
The data is
2021-09-20 12:10:00 +0000 UTC
2021-10-11 13:50:00 +0000 UTC
Try it on the Go Playground.
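For reference, here is a runnable sketch combining both changes (text/template plus {{.Date.Time}}); the nullTime struct below only mimics the shape of guregu's null.Time so the example carries no external dependency:
package main

import (
	"os"
	"text/template" // text/template: no HTML escaping of the output
	"time"
)

// nullTime mimics the shape of guregu's null.Time (a time.Time plus a Valid flag).
type nullTime struct {
	Time  time.Time
	Valid bool
}

type row struct{ Date nullTime }

const plainBody = `Hi,
Here are the results for the check run for today.
The number of rows returned is {{len .Data}}
The data is
{{range .Data}}{{.Date.Time}}
{{end}}`

func main() {
	t := template.Must(template.New("plainBody").Parse(plainBody))
	data := struct{ Data []row }{
		Data: []row{
			{Date: nullTime{Time: time.Date(2021, 9, 20, 12, 10, 0, 0, time.UTC), Valid: true}},
			{Date: nullTime{Time: time.Date(2021, 10, 11, 13, 50, 0, 0, time.UTC), Valid: true}},
		},
	}
	if err := t.Execute(os.Stdout, data); err != nil {
		panic(err)
	}
}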

Getting extra bytes 82 00 in PC/SC response

I am trying to read data from a Sony FeliCa card using a PC/SC transparent session and the transceive data object.
The response I am getting for a Read Without Encryption command is
c0 03 00 90 00 92 01 00 96 02 00 00 97 82 00 + Data
But according to the protocol, the response should be
c0 03 00 90 00 92 01 00 96 02 00 00 97 + Data
I am unable to figure out the last 82 00 appended in the response from the card.
Now when I try to authenticate with the card I get
c0 03 01 6F 01 90 00
which is an error in PC/SC. I want to resolve these extra bytes 82 00, which I believe will solve the issue with all the commands that require authentication and encryption.
The response data is BER-TLV encoded (see PC/SC 2.02, Part 3).
In BER-TLV encoding there are several possibilities to encode tag 0x97 with two octets of data 0xD0D1, e.g.:
97|02|D0D1 -- short form
97|8102|D0D1 -- long form with one length octet
97|820002|D0D1 -- long form with two length octets
97|83000002|D0D1 -- long form with three length octets
...
Your reader is using two octets to send the length of the ICC Response data object (which is perfectly valid).
You should parse the response properly... Good luck!
PS: The above means that the Data part of your truncated responses still contains one extra byte of the response length (i.e. Len|Data).
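A short Go sketch of that length decoding (the short form versus the long forms listed above); it assumes a single-byte tag and is only meant to illustrate the rule:
package main

import "fmt"

// readLength decodes a BER-TLV length field starting at data[0] and returns
// the decoded length plus the number of octets the length field itself used.
func readLength(data []byte) (length int, n int) {
	if data[0]&0x80 == 0 { // short form: a single octet with b8 clear
		return int(data[0]), 1
	}
	n = int(data[0]&0x7F) + 1 // long form: b7..b1 give the number of length octets that follow
	var v int
	for _, b := range data[1:n] {
		v = v<<8 | int(b)
	}
	return v, n
}

func main() {
	// Tag 0x97 carrying the two-byte value D0D1, with the length encoded in
	// short form and in two of the long forms from the answer above.
	for _, tlv := range [][]byte{
		{0x97, 0x02, 0xD0, 0xD1},
		{0x97, 0x81, 0x02, 0xD0, 0xD1},
		{0x97, 0x82, 0x00, 0x02, 0xD0, 0xD1},
	} {
		length, n := readLength(tlv[1:]) // skip the one-byte tag 0x97
		fmt.Printf("% X -> length %d, value % X\n", tlv, length, tlv[1+n:1+n+length])
	}
}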

Pig Scripting - Cast STRING to INT

Beginner in Pig, need help.
For all non-alphanumeric fields, cast the STRING to INT; this should be handled without passing each field name separately.
Sample data -
00013425731998101620140402300032736901 00000000AAA001200X111685V00000000
00283335542006120920131010300030003105 00000000AAA001200X117407 00000000
00000000331998101620140402300033128107 00000000AAA001200X111685 00000000
00003902331999090620140402300032545208 00000000AAA001200X111685 00000000
It's a fixed-width file; the mapping (field name, start column, end column) is as follows:
orderNumber 1 9
origin 10 10
Startdate 11 18
ModDate 19 26
Identifier 27 36
Code 37 38
CodeType 39 40
Number 41 48
Num 49 114
Either use SUBSTRING to extract the parts and then cast them, or use a regex. For example, for the first two fields:
lines = load ... as (line:chararray);
a = foreach lines generate (long)SUBSTRING(line, 0, 9) as orderNumber, SUBSTRING(line, 9, 10) as origin:chararray;
This way you should be able to convert each part of the input line into the desired components.
Alternatively, you could write a UDF that takes a string as input, does the splitting, and returns a bag or tuple.
