AtCoder Regular Contest 104

B - DNA Sequence

  • The problem statement says “if rearranged,” but there is no need to literally rearrange anything: comparing character frequency counts is sufficient.
  • Write it naively first to make sure I’m going in the right direction. python
def solve_simple(N, S):
    from collections import Counter
    ret = 0
    for i in range(N):
        for j in range(i + 1, N + 1):
            subseq = S[i:j]
            count = Counter(subseq)
            if count["A"] == count["T"] and count["C"] == count["G"]:
                ret += 1
    return ret
  • Then re-implement it in a fast enough way. Frequency counts over a range are range sums, so a cumulative-sum approach should work.
    • But since N = 5000, O(N^2) should just barely pass, so I decided to try a simple counting method first.
    • That judgment turned out to be correct: AC without using cumulative sums. python
def solve(N, S):
    from collections import defaultdict
    ret = 0
    for i in range(N):
        count = defaultdict(int)
        for j in range(i, N):
            count[S[j]] += 1
            if count["A"] == count["T"] and count["C"] == count["G"]:
                ret += 1
    return ret
  • I made one mistake with the range of j and got an RE once.
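For reference, the cumulative-sum idea mentioned above does give an O(N) solution: a substring S[i:j] is balanced exactly when the running differences (#A − #T, #C − #G) take the same value at prefixes i and j, so it suffices to count pairs of equal prefix keys. A sketch (my own code, not what was submitted in-contest):

```python
from collections import Counter

def solve_fast(N, S):
    # S[i:j] has #A == #T and #C == #G exactly when the running
    # differences (#A - #T, #C - #G) are equal at prefixes i and j,
    # so we count pairs of prefixes with the same key.
    keys = Counter({(0, 0): 1})   # empty prefix
    at = cg = 0
    ret = 0
    for ch in S:
        at += (ch == "A") - (ch == "T")
        cg += (ch == "C") - (ch == "G")
        ret += keys[(at, cg)]     # pair this prefix with earlier equal ones
        keys[(at, cg)] += 1
    return ret
```

For example, "ATCG" has three balanced substrings ("AT", "CG", "ATCG"), and `solve_fast(4, "ATCG")` returns 3.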

C - Fair Elevator

  • image

  • I had no idea how to approach it, so for a start I implemented a simple depth-first search.

  • The sample cases passed, but the code was quite complex and the WA could not be resolved.

    • Writing a halfway-fast implementation was a bad idea: the cause of the WA could not be identified.
    • Simpler, slower, obviously-correct code would have exposed the discrepancy through random tests.
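As an illustration of that random-testing workflow (using problem B's two implementations as stand-ins, since the DFS code is not shown here): compare the fast candidate against an obviously-correct brute force on many small random inputs, and print or assert on any mismatch.

```python
import random
from collections import Counter

def brute(N, S):
    # Obviously correct but slow: inspect every substring directly.
    ret = 0
    for i in range(N):
        for j in range(i + 1, N + 1):
            c = Counter(S[i:j])
            if c["A"] == c["T"] and c["C"] == c["G"]:
                ret += 1
    return ret

def fast(N, S):
    # Candidate under test: the O(N^2) extend-to-the-right version.
    ret = 0
    for i in range(N):
        count = Counter()
        for j in range(i, N):
            count[S[j]] += 1
            if count["A"] == count["T"] and count["C"] == count["G"]:
                ret += 1
    return ret

def random_test(trials=200, seed=0):
    rng = random.Random(seed)
    for _ in range(trials):
        N = rng.randint(1, 12)
        S = "".join(rng.choice("ATCG") for _ in range(N))
        # A failing (N, S) here is small enough to debug by hand.
        assert brute(N, S) == fast(N, S), (N, S)
```

Small inputs keep the brute force fast and make any counterexample easy to trace by hand.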
  • The editorial describes an O(N^3) DP based on partitioning the range into blocks.
    • A larger structure can be built up from pairwise relationships.
    • If two intervals overlap, they must have the same length → even-length blocks.

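A sketch of that DP, as I understand the block structure above (my own reconstruction, not the contest code; the input encoding as a list of (A_i, B_i) pairs with -1 for an unrecorded floor, and the name `feasible`, are my assumptions): dp[l] records whether floors below l can be split into valid even blocks, and ok(l, k) checks one block of half-length k where floor l+i pairs with l+i+k.

```python
def feasible(N, AB):
    # AB[p] = (A_p, B_p): boarding and leaving floors, -1 if unrecorded.
    M = 2 * N
    entry = [-1] * (M + 1)   # entry[pos] = person who boarded at floor pos
    exit_ = [-1] * (M + 1)   # exit_[pos] = person who left at floor pos
    for p, (a, b) in enumerate(AB):
        if a != -1:
            if entry[a] != -1 or exit_[a] != -1:
                return False         # floor claimed twice
            entry[a] = p
        if b != -1:
            if entry[b] != -1 or exit_[b] != -1:
                return False
            exit_[b] = p

    def ok(l, k):
        # Can floors l .. l+2k-1 form one block where l+i pairs with l+i+k?
        for i in range(k):
            x, y = l + i, l + i + k
            if exit_[x] != -1 or entry[y] != -1:
                return False         # x must be a boarding floor, y a leaving one
            p, q = entry[x], exit_[y]
            if p != -1 and AB[p][1] not in (-1, y):
                return False         # p's recorded exit disagrees with the block
            if q != -1 and AB[q][0] not in (-1, x):
                return False
            if p != -1 and q != -1 and p != q:
                return False         # two different people claim one pair
        return True

    # dp[l]: floors 1 .. l-1 can be partitioned into valid even blocks.
    dp = [False] * (M + 2)
    dp[1] = True
    for l in range(1, M + 1):
        if dp[l]:
            for k in range(1, (M - l + 1) // 2 + 1):
                if ok(l, k):
                    dp[l + 2 * k] = True
    return dp[M + 1]
```

There are O(N^2) candidate blocks and each check costs O(N), giving the editorial's O(N^3). On tiny hand-made cases, e.g. `feasible(1, [(1, 2)])` is true while `feasible(1, [(2, 1)])` is false.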
  • I didn’t think of this at all; this kind of “power of reduction” is what I’m missing, and I’m wondering how I can learn it.


This page is auto-translated from /nishio/ARC104 using DeepL. If you see something interesting but the auto-translated English is not good enough to understand it, feel free to let me know at @nishio_en. I’m very happy to spread my thoughts to non-Japanese readers.