chun 2011-12-6 16:04
[Repost] xensource.log grows too fast.
http://blog.sina.com.cn/s/blog_4ca83f830100xded.html<br><p>Many people have been plagued by XenServer's logs filling up the disk and eventually taking the host down. It has hit me twice (once the pool
master went down and 200 people could not work for an hour).</p>
<p>A few days ago I again found the Pool
Masters of three pools simultaneously showing runaway growth of /var/log/xensource.log*, eating roughly 150 MB of disk space per hour.</p>
<p><img src="http://s6.sinaimg.cn/middle/4ca83f83gad26d92d1f55&690&690" alt="Don't let xensource.log* cripple your XenServer" title="Don't let xensource.log* cripple your XenServer"></p>
<p>At that rate, 2 GB of free space fills up within a day, and losing a Pool Master is the worst case: all 500-plus virtual machines would go down with it!</p>
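<p>To gauge how fast the logs are eating the disk, a quick check like the following (standard coreutils; the paths match those in the post) can be run in dom0. On a machine without these logs the glob simply matches nothing:</p>

```shell
# Total size of the rotated xapi logs, and free space left on the root FS.
(du -ch /var/log/xensource.log* 2>/dev/null || true) | tail -1
df -h /
```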
<p>The xensource logging was temporarily stopped with the following steps:</p>
<p>1. Edit /etc/xensource/log.conf and comment out the four lines below (put a # in front of each)<br>
<img src="http://s11.sinaimg.cn/middle/4ca83f83gad26989beefa&690&690" alt="Don't let xensource.log* cripple your XenServer" title="Don't let xensource.log* cripple your XenServer"><br>
<br>
2. The log keeps growing until the services are restarted, so run xe-toolstack-restart to restart xapi and the related daemons</p>
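<p>Steps 1 and 2 can be scripted roughly as below. The exact four lines to comment out are only visible in the original screenshot, so matching every line that starts with "debug" is an assumption; review your own log.conf before running. The existence guard makes the script a no-op on a machine without the file:</p>

```shell
CONF=/etc/xensource/log.conf
if [ -f "$CONF" ]; then
    # Keep a backup before touching the config.
    cp -a "$CONF" "$CONF.bak"
    # Comment out the debug log sinks (assumed to start with 'debug';
    # verify this against your own log.conf first).
    sed -i 's/^\(debug\)/# \1/' "$CONF"
    # Restart xapi and its companion daemons so the change takes effect.
    xe-toolstack-restart
fi
```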
<p><img src="http://s6.sinaimg.cn/middle/4ca83f83gad26989dbbc5&690&690" alt="Don't let xensource.log* cripple your XenServer" title="Don't let xensource.log* cripple your XenServer"><br>
<br>
3. xensource.log* has stopped growing</p>
<p><img src="http://s15.sinaimg.cn/middle/4ca83f83gad26988ccb3e&690&690" alt="Don't let xensource.log* cripple your XenServer" title="Don't let xensource.log* cripple your XenServer"><br>
<br></p>
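<p>To confirm the log really has stopped growing (step 3), sampling the file size twice is enough. A short interval is used here so the sketch finishes quickly; a minute or more gives a more meaningful reading:</p>

```shell
# Sample the size of xensource.log twice and compare.
F=/var/log/xensource.log
s1=$(stat -c %s "$F" 2>/dev/null || echo 0)
sleep 3   # use 60+ seconds for a real measurement
s2=$(stat -c %s "$F" 2>/dev/null || echo 0)
if [ "$s2" -eq "$s1" ]; then
    echo "stable"
else
    echo "still growing: $((s2 - s1)) bytes"
fi
```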
<p>A look at the contents shows the entries are all debug traces from program exceptions. I have opened a support case; no conclusion so far.</p>
<p>------------------------------</p>
<p>[20110913T14:49:03.922Z|debug|hqcnctxxsd022l|2627576
inet_rpc|dispatch:host.get_uuid D:08622df260d0|backtrace] Raised at
pervasiveext.ml:26.22-25 ->
server_helpers.ml:152.10-106 ->
server.ml:10336.19-171 ->
server_helpers.ml:118.4-7</p>
<p>[20110913T14:49:03.962Z|debug|hqcnctxxsd022l|2627576
inet_rpc|host.get_uuid D:f8d52f8467ec|xapi] Raised at
db_cache_types.ml:75.27-76 ->
db_cache_types.ml:118.2-40 ->
pervasiveext.ml:22.2-9</p>
<p>[20110913T14:49:03.962Z|debug|hqcnctxxsd022l|2627576
inet_rpc|host.get_uuid D:f8d52f8467ec|backtrace] Raised at
pervasiveext.ml:26.22-25 -> db_actions.ml:4271.26-67
-> rbac.ml:227.16-23 ->
rbac.ml:236.10-15 -> server_helpers.ml:74.11-23</p>
<p>[20110913T14:49:03.962Z|debug|hqcnctxxsd022l|2627576
inet_rpc|host.get_uuid D:f8d52f8467ec|dispatcher]
Server_helpers.exec exception_handler: Got exception
HANDLE_INVALID: [ host; OpaqueRef:NULL ]</p>
<p>[20110913T14:49:03.962Z|debug|hqcnctxxsd022l|2627576
inet_rpc|host.get_uuid D:f8d52f8467ec|dispatcher] Raised at
string.ml:150.25-34 -> stringext.ml:108.13-29</p>
<p>[20110913T14:49:03.962Z|debug|hqcnctxxsd022l|2627576
inet_rpc|host.get_uuid D:f8d52f8467ec|backtrace] Raised at
string.ml:150.25-34 -> stringext.ml:108.13-29</p>
<p>[20110913T14:49:03.962Z|debug|hqcnctxxsd022l|2627576
inet_rpc|host.get_uuid D:f8d52f8467ec|xapi] Raised at
server_helpers.ml:92.14-15 ->
pervasiveext.ml:22.2-9</p>
<p>[20110913T14:49:03.962Z|debug|hqcnctxxsd022l|2627576
inet_rpc|host.get_uuid D:f8d52f8467ec|xapi] Raised at
pervasiveext.ml:26.22-25 ->
pervasiveext.ml:22.2-9</p><br>
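<p>To see which exception dominates entries like the excerpt above, the "Got exception" lines can be tallied with standard text tools (paths as in the post; on a host without these logs the pipeline just prints nothing):</p>

```shell
# Count the distinct exceptions reported in the xapi logs, most frequent first.
(grep -h 'Got exception' /var/log/xensource.log* 2>/dev/null || true) \
  | sed 's/.*Got exception //' \
  | sort | uniq -c | sort -rn | head
```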