Switching to Cloudflare as DNS



Translation: What are PreCertificates?

TBSCertificate: To-Be-Signed Certificate, i.e. the certificate body about to be signed and issued; see https://stackoverflow.com/questions/5304531/certificate-structure
CT: Certificate Transparency
On to PreCertificates...
PreCertificates are defined in section "3.1. Log Entries" as (text trimmed by me) "The Precertificate is constructed from the certificate to be issued by adding a special critical poison extension to the end-entity TBSCertificate". Then it describes how it can be produced, and it is mentioned throughout the spec in many places.
A PreCertificate is essentially a certificate signed in one of two ways:
1. PreCertificates signed by the real CA.
This sounds very dangerous, as it will break the fundamental X.509 rule of unique issuerDN/serialNumber pairs. The consequences of having two "certificates" with the same issuerDN/serialNumber in the wild cannot possibly be estimated, making this practice quite dangerous imho.
2. PreCertificates signed by a separate PreCertificate signing CA, which is a SubCA of the real signing CA. This is less scary, since it is normal practice that different CAs can issue certificates with the same subjectDN/serialNumber, just not the same issuerDN.
The actual implementation of issuing PreCertificates makes it quite impractical. I would believe that most CA implementations create the TBSCertificate as part of the actual certificate issuance. The CA will not create the TBSCertificate to have it lying around for a couple of days before using it to issue the real certificate.
Thus, if the CA is to create a PreCertificate to send to the CT log, it might as well issue the real certificate and send it to the log. The time difference should be in the milliseconds for most CAs.
If the CA wants to wait before distributing the real certificate, to make sure it's in the logs before putting it into production, it can surely do so as well.
The PreCertificate imho suffers from several complicating factors for implementers, both on the CA and the CT log side. The TBSCertificate must have a poison extension inserted, and later removed, effectively re-encoding the ASN.1 TBSCertificate several times; all of these are points of failure.
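For scale, the poison extension RFC 6962 mandates is tiny: extnID 1.3.6.1.4.1.11129.2.4.3, critical TRUE, and an extnValue wrapping an ASN.1 NULL. A hand-rolled sketch (illustration only, not a full TBSCertificate encoder) of the bytes that get inserted and later stripped:

```python
# DER-encode RFC 6962's poison extension:
#   Extension ::= SEQUENCE { extnID OID, critical BOOLEAN, extnValue OCTET STRING }

def der_tlv(tag, value):
    # short-form length only, which is fine for payloads under 128 bytes
    assert len(value) < 128
    return bytes([tag, len(value)]) + value

def encode_oid(*arcs):
    # first two arcs share one byte; the rest are base-128 with continuation bits
    body = bytearray([arcs[0] * 40 + arcs[1]])
    for arc in arcs[2:]:
        chunk = [arc & 0x7F]
        arc >>= 7
        while arc:
            chunk.append((arc & 0x7F) | 0x80)
            arc >>= 7
        body += bytes(reversed(chunk))
    return der_tlv(0x06, bytes(body))

poison = der_tlv(0x30, (
    encode_oid(1, 3, 6, 1, 4, 1, 11129, 2, 4, 3)  # extnID: the poison OID
    + der_tlv(0x01, b'\xff')                      # critical: TRUE
    + der_tlv(0x04, der_tlv(0x05, b''))           # extnValue: OCTET STRING holding NULL
))
print(poison.hex())  # -> 3013060a2b06010401d6790204030101ff04020500
```

The 21 bytes above are what gets spliced into the extensions SEQUENCE and later stripped back out, with the TBSCertificate re-encoded each time.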
The reason for PreCertificates is not clearly explained. Why would you want to use PreCertificates?
Fine-combing through the spec gives me some ideas why: for example, to be able to embed the certificate extension from PreCertificate CT logs in the final certificate (section 3.3). But then the TBSCertificate of the PreCertificate is no longer the real TBSCertificate? In that case, why is the PreCertificate the TBSCertificate at all, and not just a new data structure with the data the CT log wants?
The PreCertificate complicates the CT spec by orders of magnitude, which is not a good thing. There are so many ifs and buts about PreCertificates that the RFC is not even itself consistent about what it is.
Ok, I know the PreCertificate is optional, but the best standards, the ones that get fast, wide and robust deployment, are the simpler ones (KISS). Dropping PreCertificates from the CT spec makes it so much simpler.
My suggestion:
- Skip PreCertificates altogether
I see, though, why people will not accept that just because I say so... so in that case:
- Explain the purpose behind PreCertificates well
- Describe what the actual information from the PreCertificate is used for
- Be consistent throughout the RFC

Feel free to contact me at tomas a t primekey dot se.

KanColle statistics helper functions

def get_max_time(normal = '', large = ''):
    # how many runs the stockpile covers: minimum of stock // per-run cost
    normal_tuple = normal.split()
    large_tuple = large.split()
    max_time = 100  # upper bound: 100 runs
    for i, j in zip(normal_tuple, large_tuple):
        time_this = int(j) // int(i)
        max_time = min(time_this, max_time)
    return max_time
def get_max_time_round(normal, large):
    # rounded estimate over already-split integer lists
    max_time = 20  # upper bound: 20 runs
    for i, j in zip(normal, large):
        time_this = int(j) // int(i)
        if abs(max_time - time_this) > 1:  # a difference of one run doesn't matter
            max_time = int((max_time + time_this) / 2)
            max_time = min(max_time, time_this + 1)
    return max_time
def calc_difference(normal = '', large = '', max_time = 0):
    # print max_time and, per resource, the cost of max_time runs minus current stock
    normal_tuple = [int(i) for i in normal.split()[:5]]
    large_tuple = [int(i) for i in large.split()[:5]]
    if max_time == 0:
        max_time = get_max_time_round(normal_tuple, large_tuple)
    print('\t'.join([str(max_time)] + [str(i * max_time - j) for i, j in zip(normal_tuple, large_tuple)]))
def calc_difference2(normal = '', large = '', max_time = 0):
    # same as calc_difference, but returns the line instead of printing it
    normal_tuple = [int(i) for i in normal.split()[:5]]
    large_tuple = [int(i) for i in large.split()[:5]]
    if max_time == 0:
        max_time = get_max_time_round(normal_tuple, large_tuple)
    return '\t'.join([str(max_time)] + [str(i * max_time - j) for i, j in zip(normal_tuple, large_tuple)])
def get_chance(chance_normal, time):
    # chance (in %) of at least one success in `time` tries: 1 - (1 - p)^time
    print(get_chance2(chance_normal, time))
def get_chance2(chance_normal, time):
    return str(round(100 - ((1 - chance_normal / 100) ** time) * 100, 2)) + '%'
def calc_chance(start, end, chance):
    # print the cumulative chance for every try count from start to end
    for i in range(start, end + 1):
        print(get_chance2(chance, i))
def calc_difference_multiple(normal = '', large = '', max_time = 0):
    # print shortfalls for a few candidate run counts around the estimate
    normal_tuple = [int(i) for i in normal.split()[:5]]
    large_tuple = [int(i) for i in large.split()[:5]]
    if max_time == 0:
        max_time = get_max_time_round(normal_tuple, large_tuple)
    for i in range(max_time - 2, max_time + 1):
        print(calc_difference2(normal, large, i))
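A worked example of what the helpers compute, with invented per-run costs and stock figures; the expressions mirror the logic of get_max_time and get_chance2 above, restated inline so the example is self-contained:

```python
# invented numbers: per-run cost and current stock for five resources
cost = [int(i) for i in '10 10 15 5 1'.split()]
stock = [int(i) for i in '200 180 300 90 20'.split()]

# get_max_time logic: the stock covers min(stock // cost) runs
runs = min(s // c for c, s in zip(cost, stock))
print(runs)  # -> 18

# get_chance2 logic: chance of at least one success in 5 tries at 30% each
chance = str(round(100 - ((1 - 30 / 100) ** 5) * 100, 2)) + '%'
print(chance)  # -> 83.19%
```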


Note: Python: Follow 301 redirect

import urllib.parse as urlparse
import http.client as httplib
def resolve_http_redirect(url, depth=0):
    # follow Location headers recursively, giving up after 10 hops
    if depth > 10:
        raise Exception("Redirected " + str(depth) + " times, giving up.")
    o = urlparse.urlparse(url, allow_fragments=True)
    # pick the connection class matching the scheme
    if o.scheme == 'https':
        conn = httplib.HTTPSConnection(o.netloc)
    else:
        conn = httplib.HTTPConnection(o.netloc)
    path = o.path or '/'
    if o.query:
        path += '?' + o.query
    conn.request("HEAD", path)
    res = conn.getresponse()
    headers = dict(res.getheaders())
    conn.close()
    if 'Location' in headers and headers['Location'] != url:
        return resolve_http_redirect(headers['Location'], depth + 1)
    return url
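The path handling can be sanity-checked without a network round trip; urlparse splits a URL into exactly the pieces the function reassembles (example.com is a placeholder host):

```python
import urllib.parse as urlparse

o = urlparse.urlparse('https://example.com/a/b?x=1#frag', allow_fragments=True)
path = o.path or '/'
if o.query:
    path += '?' + o.query
print(o.scheme, o.netloc, path)  # -> https example.com /a/b?x=1
```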



Notes: How to work out a video site's stream extraction


  •  Direct page scraping? (FC2Video)
    • Use exec to turn it straight into a dict? (pixnet)
      • Is the encoding replaced wholesale? (pixnet)
  • An API exposed via Flash? (many, e.g. Bilibili)
    • Are the API parameters adjustable? (pixnet, fun.tv)
      • Can parameters be left out? (pixnet, Bilibili)
      • Can you apply for a key? (Vimeo, Bilibili)
    • Is a hash involved? (the vast majority)
      • Decompile the source file? (iQiyi and shitloads of them, letvcloud)
  • HTML5? (Weibo Miaopai)
    • API? Mobile version? (Fun.tv)
    • Is the API the same? (Fun.tv, Letvcloud)
    • Is some of the crypto impossible to reproduce? (Letvcloud)
  • Mobile site? (Pandora)
    • Is the stream right in the page? (Pandora)
    • iOS? Android?
    • Is the quality the same?
  • Mobile app?
    • Is there an API? (Vimeo, Bilibili)
      • Is it HTTPS? (Vimeo)
        • Can you MITM/SSLStrip it? (Vimeo)
    • Is a hash required? (many)
      • Decompile the Android app? (a different project, Chrome DCP Standalone https://github.com/cnbeining/Chrome-Data-Compression-Proxy-Standalone-Python, used this)
  • Defeating the crypto?
    • Special tools? (Letvcloud, Bilibili)
    • Hints from other tools/sites?
    • Special referer/UA? Special XFF/X-Real-IP? (Bilibili)
    • Counter-detection on their side? (iQiyi)
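The special-header checks in the last branch can be probed with nothing but the standard library. The endpoint below is a placeholder, and whether spoofed XFF/X-Real-IP headers actually help depends entirely on the site:

```python
import urllib.request

# hypothetical endpoint; these are the headers some sites gate playback on
req = urllib.request.Request(
    'https://example.com/api/playurl',
    headers={
        'User-Agent': 'Mozilla/5.0',
        'Referer': 'https://www.bilibili.com/',  # referer check
        'X-Forwarded-For': '127.0.0.1',          # some backends trust these
        'X-Real-IP': '127.0.0.1',                # blindly for geo/rate limits
    },
)
# building the Request does not touch the network; inspect what would be sent
print(req.get_header('Referer'))  # -> https://www.bilibili.com/
```

Note that urllib normalizes header names (`X-Real-IP` is stored as `X-real-ip`), which matters when reading them back.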


Reinstalling the OS



For the Record

Two days ago the police came to me and wanted me to stop working on this. Today they asked me to delete all the code from GitHub. I have no choice but to obey.
I hope one day I'll live in a country where I have the freedom to write any code I like without fear.
I believe you guys will make great stuff with Network Extensions.
Lest we forget.